
Recognizing human motion using KNN

Core Motion is an iOS framework that provides an API for the inertial sensors of mobile devices. It also recognizes several types of user motion and stores them in the HealthKit database.

If you are not familiar with the Core Motion API, please check the framework reference: https://developer.apple.com/reference/coremotion.

The code for this example can be found in the Code/02DistanceBased/MotionClassification folder of the supplementary materials.

As of iOS 11 beta 2, the CMMotionActivity class includes the following types of motion:

  • Stationary
  • Walking
  • Running
  • Automotive
  • Cycling

Everything else falls into the unknown category or is recognized as one of the preceding types. Core Motion doesn't provide a way to recognize custom motion types, so we'll train our own classifier for this purpose. Unlike the decision trees from the previous chapter, the KNN classifier will be trained on the device, end-to-end. It will also not be frozen inside Core ML: because we keep full control over it, we'll be able to update it at application runtime.
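
A KNN classifier is simple enough to implement directly in Swift. The following sketch is mine, not the book's code: it memorizes labeled feature vectors and classifies a new vector by majority vote among its k nearest neighbors, assuming Euclidean distance and equal-length feature vectors. Because "training" is just remembering samples, the model can keep growing while the app runs, which is exactly the property described above.

import Foundation

// Minimal k-NN sketch (illustrative names, not the book's implementation).
class KNNClassifier {
    private var samples: [(features: [Double], label: String)] = []
    let k: Int

    init(k: Int = 3) {
        self.k = k
    }

    // "Training" on the device is just remembering the labeled sample,
    // which is why the model can be updated at application runtime.
    func addSample(_ features: [Double], label: String) {
        samples.append((features: features, label: label))
    }

    // Euclidean distance; assumes both vectors have the same length.
    private func distance(_ a: [Double], _ b: [Double]) -> Double {
        return sqrt(zip(a, b).map { pair in (pair.0 - pair.1) * (pair.0 - pair.1) }.reduce(0, +))
    }

    func predict(_ features: [Double]) -> String? {
        guard !samples.isEmpty else { return nil }
        let nearest = samples
            .map { (label: $0.label, dist: distance($0.features, features)) }
            .sorted { $0.dist < $1.dist }
            .prefix(k)
        // Majority vote among the k nearest neighbors.
        let votes = Dictionary(grouping: nearest, by: { $0.label })
        return votes.max { $0.value.count < $1.value.count }?.key
    }
}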

iOS devices have three types of motion sensors:

  • Gyroscope: This measures device orientation in space
  • Accelerometer: This measures device acceleration
  • Magnetometer or compass: This measures the magnetic field around the device

They also have a barometer to detect elevation changes, and some other sensors, but those are less relevant for our purposes. We will use the accelerometer data stream to train our KNN classifier and predict different motion types, such as shaking the phone or squatting.

The following listing shows how to get updates from the accelerometer:

import CoreMotion

let manager = CMMotionManager()
// Deliver accelerometer samples every 0.1 seconds (10 Hz).
manager.accelerometerUpdateInterval = 0.1
manager.startAccelerometerUpdates(to: OperationQueue.main) { (data: CMAccelerometerData?, error: Error?) in
    if let acceleration = data?.acceleration {
        print(acceleration.x, acceleration.y, acceleration.z)
    }
}
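
In a real application you would also check manager.isAccelerometerAvailable before starting updates, and call manager.stopAccelerometerUpdates() once recording is finished; both are part of CMMotionManager.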

The accelerometer APIs in Core Motion provide a time series of three-dimensional vectors, as shown in the following diagram:

Figure 3.7: Core Motion coordinate system for accelerometer and gyroscope

To train our classifier, we need some labeled data. As we don't have a ready dataset and motion signals can be very different from person to person, we are going to allow the user to add new samples and improve the model. In the interface, the user selects the type of motion he wants to record, and presses the Record button, as shown in the next screenshot. The application samples 25 acceleration vectors, takes the magnitude of each vector, and feeds them with the label of the selected motion type into the KNN classifier. The user records as many samples as he wants.
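
Putting the pieces together, a sketch of this recording flow might look as follows. The 25-sample buffer matches the text; the KNNClassifier type and the "squat" label are the hypothetical names from the earlier sketch, not the book's actual code.

import Foundation
import CoreMotion

let classifier = KNNClassifier(k: 3)  // hypothetical classifier from the earlier sketch
let manager = CMMotionManager()
manager.accelerometerUpdateInterval = 0.1

var magnitudes: [Double] = []
let selectedLabel = "squat"           // the motion type the user selected to record

manager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    // Reduce each 3D acceleration vector to its magnitude.
    magnitudes.append(sqrt(a.x * a.x + a.y * a.y + a.z * a.z))

    if magnitudes.count == 25 {
        // One labeled training sample = 25 consecutive magnitudes.
        classifier.addSample(magnitudes, label: selectedLabel)
        magnitudes.removeAll()
        manager.stopAccelerometerUpdates()
    }
}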
