
Recognizing human motion using KNN

Core Motion is an iOS framework that provides an API for the inertial sensors of mobile devices. It also recognizes several common types of user motion and stores them in the HealthKit database.

If you are not familiar with Core Motion API, please check the framework reference: https://developer.apple.com/reference/coremotion.

The code for this example can be found in the Code/02DistanceBased/MotionClassification folder of the supplementary materials.

As of iOS 11 beta 2, the CMMotionActivity class includes the following types of motion:

  • Stationary
  • Walking
  • Running
  • Automotive
  • Cycling

Everything else falls into the unknown category or is misrecognized as one of the preceding types. Core Motion doesn't provide a way to recognize custom motion types, so we'll train our own classifier for this purpose. Unlike the decision trees from the previous chapter, the KNN classifier will be trained on the device end-to-end. It also won't be frozen inside a Core ML model: because we keep full control over it, we'll be able to update it at application runtime.
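To make the "trained on device, updatable at runtime" point concrete, here is a minimal KNN sketch. The names (KNNClassifier, addSample, predict) and the choice of Euclidean distance are illustrative assumptions, not the book's actual implementation:

```swift
import Foundation

// A minimal k-NN classifier sketch. Training is just storing labeled
// samples, which is why the model can keep learning at runtime,
// unlike a frozen Core ML model.
struct KNNClassifier {
    private var samples: [(features: [Double], label: String)] = []
    let k: Int

    init(k: Int = 3) { self.k = k }

    // "Training": remember one labeled feature vector.
    mutating func addSample(_ features: [Double], label: String) {
        samples.append((features, label))
    }

    // Euclidean distance between two equal-length feature vectors.
    private func distance(_ a: [Double], _ b: [Double]) -> Double {
        var sum = 0.0
        for (x, y) in zip(a, b) { sum += (x - y) * (x - y) }
        return sum.squareRoot()
    }

    // Majority vote among the k nearest stored samples.
    func predict(_ features: [Double]) -> String? {
        guard !samples.isEmpty else { return nil }
        let nearest = samples
            .map { (dist: distance($0.features, features), label: $0.label) }
            .sorted { $0.dist < $1.dist }
            .prefix(k)
        var votes: [String: Int] = [:]
        for neighbor in nearest { votes[neighbor.label, default: 0] += 1 }
        return votes.max { $0.value < $1.value }?.key
    }
}
```

Note that there is no separate training step: adding a sample immediately affects future predictions, which is exactly the property we rely on later when the user records new motions.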

iOS devices have three types of motion sensors:

  • Gyroscope: This measures the device's rotation rate around three axes
  • Accelerometer: This measures the acceleration acting on the device
  • Magnetometer (compass): This measures the surrounding magnetic field

They also have a barometer, which measures air pressure and can be used to estimate elevation changes, and some other sensors, but these are less relevant for our purposes. We will use the accelerometer data stream to train our KNN classifier and predict custom motion types, like shaking the phone or squatting.

The following listing shows how to get updates from the accelerometer:

import CoreMotion

let manager = CMMotionManager()
// Request accelerometer updates at 10 Hz.
manager.accelerometerUpdateInterval = 0.1
manager.startAccelerometerUpdates(to: OperationQueue.main) { (data: CMAccelerometerData?, error: Error?) in
    if let acceleration = data?.acceleration {
        print(acceleration.x, acceleration.y, acceleration.z)
    }
}

The accelerometer APIs in Core Motion provide a time series of three-dimensional vectors, as shown in the following diagram:

Figure 3.7: Core Motion coordinate system for accelerometer and gyroscope
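Since the raw (x, y, z) components depend on how the user holds the phone, a common trick (and the one used for the features below) is to collapse each vector to its magnitude. A small helper, assuming a free function named magnitude for illustration:

```swift
import Foundation

// Magnitude of a 3D acceleration vector: sqrt(x² + y² + z²).
// Using the magnitude instead of the raw components makes the
// feature invariant to the phone's orientation.
func magnitude(x: Double, y: Double, z: Double) -> Double {
    (x * x + y * y + z * z).squareRoot()
}
```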

To train our classifier, we need labeled data. As we don't have a ready-made dataset, and motion signals can differ significantly from person to person, we are going to let the user add new samples and improve the model. In the interface, the user selects the type of motion they want to record and presses the Record button, as shown in the next screenshot. The application samples 25 acceleration vectors, takes the magnitude of each vector, and feeds the magnitudes, together with the label of the selected motion type, into the KNN classifier. The user can record as many samples as they want.
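The buffering step described above can be sketched as follows. The MotionRecorder type and its append method are hypothetical names introduced for illustration; the window size of 25 comes from the text:

```swift
import Foundation

// Accumulates acceleration magnitudes until a full window of 25
// samples is collected, then hands the window off as one feature
// vector, ready to be labeled and added to the KNN classifier.
final class MotionRecorder {
    static let windowSize = 25
    private var buffer: [Double] = []

    // Called once per accelerometer update. Returns a complete
    // 25-element window when the buffer fills up, nil otherwise.
    func append(magnitude: Double) -> [Double]? {
        buffer.append(magnitude)
        guard buffer.count == MotionRecorder.windowSize else { return nil }
        let window = buffer
        buffer.removeAll()
        return window
    }
}
```

In the app, each completed window would be passed to the classifier together with the motion label the user selected before pressing Record.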
