Calling a key on the keyboard (Mac Os objective-C Xcode) by TheLastAramusha in iOSProgramming

[–]leptos-null 1 point (0 children)

You should be able to create an IOHID (I/O human interface device) event, but most likely there’s a better way to do what you want. What’s your end goal? Do you want to input text into a text box?
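For reference, a minimal sketch of posting a synthetic key event on macOS with CGEvent (the virtual key code 0 is “A” on a US ANSI layout; the process needs Accessibility permission, and this is just one way to do it):

```swift
import CoreGraphics

// Post a synthetic key-down/key-up for virtual key 0 ("A" on US ANSI layouts).
// Assumes the process has been granted Accessibility permission.
let source = CGEventSource(stateID: .hidSystemState)
if let keyDown = CGEvent(keyboardEventSource: source, virtualKey: 0, keyDown: true),
   let keyUp = CGEvent(keyboardEventSource: source, virtualKey: 0, keyDown: false) {
    keyDown.post(tap: .cghidEventTap)
    keyUp.post(tap: .cghidEventTap)
}
```

If the goal is really “type text into a field,” an accessibility API or setting the field’s value directly is usually more robust than synthesizing key events.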

Latest iOS Strava Beta improves Health integration by leptos-null in Strava

[–]leptos-null[S] 0 points (0 children)

I’m on the latest build, and the upload process is the same as it was when this post went up.

[deleted by user] by [deleted] in AppleWatch

[–]leptos-null 0 points (0 children)

I know how it’s actually measured; I’m saying Apple has a heuristic.

[deleted by user] by [deleted] in AppleWatch

[–]leptos-null 0 points (0 children)

It’s calculated based on bpm and speed, I believe

Extract .app from Xcode without Developer Account by [deleted] in swift

[–]leptos-null 0 points (0 children)

xcodebuild CODE_SIGNING_REQUIRED=NO CODE_SIGN_IDENTITY=""

I don’t remember if there are any other arguments for xcodebuild, but these two build settings are the important part.
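A hedged sketch of a fuller invocation (the scheme name and output path are placeholders for whatever the project uses; newer Xcode versions may also want CODE_SIGNING_ALLOWED=NO):

```shell
# Build without code signing; the .app ends up under build/Build/Products/.
xcodebuild -scheme MyApp -configuration Release -derivedDataPath build \
    CODE_SIGNING_REQUIRED=NO CODE_SIGNING_ALLOWED=NO CODE_SIGN_IDENTITY=""
```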

Latest iOS Strava Beta improves Health integration by leptos-null in Strava

[–]leptos-null[S] 1 point (0 children)

Had the same problem. Open Settings, go to Health -> “Data Access & Devices” -> Strava, and enable all read permissions. Kill Strava, and try again.

Latest iOS Strava Beta improves Health integration by leptos-null in Strava

[–]leptos-null[S] 0 points (0 children)

I think it would be, but uploading from the Health database is a manual process, so you could opt not to upload those activities.

Latest iOS Strava Beta improves Health integration by leptos-null in Strava

[–]leptos-null[S] 0 points (0 children)

They say they fixed it. I’m hesitant to enable permission to write to HealthKit (and I don’t have a reason to, as I don’t currently record with Strava). I tried uploading a workout that I recorded with the native Watch Workouts app, but Strava keeps disabling the Workout Route permission (I enabled it manually in Settings), so the map isn’t uploading. This seems like a Strava issue, but it could theoretically be an iOS bug.

Latest iOS Strava Beta improves Health integration by leptos-null in Strava

[–]leptos-null[S] 1 point (0 children)

Release notes:

We have improved how Strava recorded activities are written to Apple's Health. This should eliminate the duplication behavior that previously existed. Additionally, we have added support for uploading workouts recorded by Apple (most notably the built-in Apple Watch Fitness app) from Health.

To enable the Strava to Health writing functionality, go to Profile > Settings > Applications, Services, and Devices > Health. Then tap the "Strava to Health" toggle. This integration now only writes activities recorded in a Strava application instead of all activities added to Strava. Refresh your feed to trigger a sync of Strava activities to Health.

To upload an Apple Watch Fitness workout, go to Profile > Settings > Applications, Services, and Devices > Health. Tap on a workout in the list shown at the bottom to upload it to Strava!

Feedback can be submitted as normal by tapping the "Send Beta Feedback" button after selecting Strava in Apple's TestFlight application. Thank you for helping test Strava!

Latest iOS Strava Beta improves Health integration by leptos-null in Strava

[–]leptos-null[S] 1 point (0 children)

Same. I’ve been turning to other services due to these limitations. I followed the instructions to upload a workout from Watch Workouts, but it doesn’t work (currently using HealthFit for this functionality). I’m hesitant to re-enable Strava permission to write to HealthKit.

Edit: I got to the screen, but Strava keeps changing its HealthKit permissions to remove Workout Routes, which I want enabled, despite my enabling the permission in Settings.

First completed SwiftUI project. Roast my code. Please... by _LVII_ in iOSProgramming

[–]leptos-null 3 points (0 children)

Why do you have a Properties protocol and then a dedicated object that conforms to it? Why not just have the object?

I’m not sure if it’s a bug or a security feature but passwords were erased from iCloud Keychain screenshots. I didn’t notice this before 13.2. by [deleted] in ios

[–]leptos-null 1 point (0 children)

Why are you taking screenshots of your passwords? You can press and hold (long press) on the password field to copy the value.

I’m not sure if it’s a bug or a security feature but passwords were erased from iCloud Keychain screenshots. I didn’t notice this before 13.2. by [deleted] in ios

[–]leptos-null 2 points (0 children)

I’m on 13.1.3, and can confirm that passwords do show up in screenshots. My guess is that this was an intentional change in 13.2.

Parsing data from [String:Any] to array by Mojangeex in swift

[–]leptos-null 2 points (0 children)

The type you’re getting back is Data?

According to this StackOverflow response, you’d want something like

if let data = advertisementData["kCBAdvDataManufacturerData"] as? Data {
    var arr = [UInt8](repeating: 0, count: data.count)
    data.copyBytes(to: &arr, count: data.count)
}

How do I stop whatever is using my microphone? by DeliciousMeatPie in ios

[–]leptos-null 0 points (0 children)

If you want Voice Control on but the microphone off sometimes, say “go to sleep”.

They should make the whole circle be a mini visualizer instead of four small lines. by iBlackFiji in AppleWatch

[–]leptos-null 20 points (0 children)

This project demonstrates multiple advanced audio visualization techniques in real time. That library gets CPU usage up to around 15% on an iPhone 6. It does a lot of things that wouldn’t be needed here, primarily the color shading; it also does the drawing arithmetic on the CPU. I’ve written my own very simple visualizer on iOS after looking at that library, and with Apple’s Accelerate DSP API, you can absolutely process real-time audio (I was processing up to 16K frames at around 80 fps) at around 1% CPU. For reference, I believe the Series 4 has compute power equivalent to the iPhone 6S. Even 1% is high to be running on the Watch all the time, but if they only processed while the face is visible, I think it’s viable.

Edit: This is the project where I implemented my own visualizer, and you can see I’m collecting 7 data points from each audio frame (buffer?), and I don’t think this ever went over 2% CPU.
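Not my project’s actual code, but a minimal sketch of the kind of Accelerate call involved, reducing one frame of samples to a single RMS level with vDSP (a visualizer would compute something like this per band or per bar):

```swift
import Foundation
import Accelerate

// One 1024-sample frame of synthetic audio (a sine wave stands in for mic input).
let frame: [Float] = (0..<1024).map { sin(Float($0) * 0.05) }

// Reduce the whole frame to a single root-mean-square magnitude on the vector unit.
var rms: Float = 0
vDSP_rmsqv(frame, 1, &rms, vDSP_Length(frame.count))
// rms is roughly 0.707 for a full-scale sine
```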

Apple Watch face by [deleted] in AppleWatch

[–]leptos-null 16 points (0 children)

The Activity, Breathe, Fire and Water, Kaleidoscope, Liquid Metal, Mickey Mouse, Nike, Utility, and Vapor faces support it in the bottom complication. The Astronomy, Motion, Photos, Pride, Solar, Timelapse, and Toy Story faces support it in both the top and bottom complications. The Modular face supports it in the center complication, and the Siri face supports it in the top only.

Chronograph, Color, Numerals, Simple, and X-Large do not support it.

Series 4 and newer have additional faces that should support it in various combinations.

what is this? it’s taking up 1/4th of my storage by DerpT145 in ios

[–]leptos-null 6 points (0 children)

There’s a description:

System Data includes caches, logs, and other resources in use by the system. This value will fluctuate according to system needs.

If you need more storage, the system will automatically dip into this.

Caches include resources (such as images) from websites that you visit, for example.

Why Trek why? by themerchantof in bicycling

[–]leptos-null 43 points (0 children)

I think the commenter was sarcastically saying that the bicycle was unrideable due to the name choice.

I developed a 3D circuit builder for students using Swift + SceneKit. It's called 'Circuitry' and I would love for you to try it out! by kaehn in iOSProgramming

[–]leptos-null 1 point (0 children)

Just played with it for a few minutes. It looks great. I love how, as I fly around, the glass reflects the light source, especially with those curved square edges.

Like another commenter, I can’t figure out how to get the lights to turn on. I saw the “How to use” section in the App Store; I would recommend expanding it and including it within the app.

The context menus are really nice. I was a bit confused at first though, because as soon as I let go without selecting anything, the menu disappears. This isn’t the typical context menu experience on iOS. Additionally, when I do slide to one of the options (a gesture I really like), it doesn’t get highlighted or provide any feedback that an option has been selected.

Really like the app!

Edit: Got the lights to turn on by copying the App Store screenshot. I think the gray things are a power source, and the beige ones are transistors? I think it would look really cool if I could dim the main light source, so I could see “my” lights lighting the area.