augmented reality - CoreMotion data to SceneKit SCNNode orientation
I'm trying to use CoreMotion to rotate a SceneKit camera properly. The scene I've created is simple ... all it does is create a bunch of boxes distributed over an area, with the camera pointing down the Z axis, something like the sketch below.
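A minimal sketch of that setup (the exact box positions and names like cameraNode are only illustrative):

import SceneKit

func makeScene() -> SCNScene {
    let scene = SCNScene()

    // A handful of boxes scattered in front of the origin.
    for i in 0..<10 {
        let boxNode = SCNNode(geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0))
        boxNode.position = SCNVector3(x: Float(i % 5) * 3 - 6, y: 0, z: -Float(i / 5) * 3 - 5)
        scene.rootNode.addChildNode(boxNode)
    }

    // The camera sits at the origin and, by default, looks down the negative Z axis.
    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    scene.rootNode.addChildNode(cameraNode)

    return scene
}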
Unfortunately, the data coming back does not seem to relate to the device's physical orientation in any way. It just seems random.
As suggested, I'm taking the attitude quaternion and applying it directly to the camera node through its orientation property (see the sketch at the end of this question).

Do I have a misconception about what the Core Motion data is giving me? Shouldn't the attitude reflect the physical orientation of the device? Or is it an incremental movement that I should accumulate on top of the previous orientation?
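For reference, the core of what I'm doing now looks roughly like this (a sketch only; motionManager and cameraNode are properties of my view controller):

motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue(), withHandler: { (motion: CMDeviceMotion!, error: NSError!) -> Void in

    // Apply the attitude quaternion straight to the camera node.
    let q = motion.attitude.quaternion
    self.cameraNode.orientation = SCNQuaternion(x: Float(q.x), y: Float(q.y), z: Float(q.z), w: Float(q.w))
})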
This piece of code might help you:
motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue(), withHandler: { (motion: CMDeviceMotion!, error: NSError!) -> Void in

    let currentAttitude = motion.attitude
    let roll = Float(currentAttitude.roll)
    let pitch = Float(currentAttitude.pitch)
    let yaw = Float(currentAttitude.yaw) + (0.5 * Float(M_PI))

    self.cameraNode.eulerAngles = SCNVector3(x: -roll, y: yaw, z: -pitch)
})

This setup matches one particular device orientation; play with the + and - signs and the 0.5 * Float(M_PI) offset on the different angles until the tilt matches your device. Don't forget to import CoreMotion.
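For what it's worth, eulerAngles and orientation are just two representations of the same SCNNode rotation, so the quaternion approach from the question and the snippet above end up driving the same thing; the sign flips and the 0.5 * Float(M_PI) offset are there to account for the difference between Core Motion's reference frame and the direction a SceneKit camera looks (down its node's negative Z axis).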