It takes some time to become familiar with new APIs and the like. But with a new product category such as AR/VR/spatial computing, with so many new paradigms, I would think that getting real hands-on experience early is vital.
Otherwise developers end up wasting a lot of time creating concepts that ultimately don’t work well in real world use. The visionOS simulator is no substitute for actually wearing a headset.
The fact that developers aren’t grabbing every chance to get hands-on with the hardware doesn’t sound great to me. I suppose the main question is whether that’s because Apple aren’t handling developer access well enough (I’ve seen a lot of complaints about the limited locations of the labs), or because developers aren’t interested in the platform.