
Recognizing human motion using KNN

Core Motion is an iOS framework that provides an API for the inertial sensors of mobile devices. It also recognizes some user motion types and stores them in the HealthKit database.

If you are not familiar with the Core Motion API, please check the framework reference: https://developer.apple.com/reference/coremotion.

The code for this example can be found in the Code/02DistanceBased/MotionClassification folder of the supplementary materials.

As of iOS 11 beta 2, the CMMotionActivity class includes the following types of motion:

  • Stationary
  • Walking
  • Running
  • Automotive
  • Cycling

Everything else falls into the unknown category or gets misrecognized as one of the preceding types. Core Motion doesn't provide a way to recognize custom motion types, so we'll train our own classifier for this purpose. Unlike the decision trees from the previous chapter, the KNN classifier will be trained on-device, end to end. It also won't be frozen inside a Core ML model: because we keep full control over it, we'll be able to update it at application runtime.
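For reference, the built-in recognition can be consumed through CMMotionActivityManager. The following is a minimal sketch, assuming the app has Motion & Fitness permission; it is not part of the book's example project:

import CoreMotion

let activityManager = CMMotionActivityManager()

// Built-in activity recognition is not available on every device.
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: OperationQueue.main) { activity in
        guard let activity = activity else { return }
        // Each CMMotionActivity carries Boolean flags plus a confidence level.
        if activity.stationary { print("stationary") }
        if activity.walking { print("walking") }
        if activity.running { print("running") }
        if activity.automotive { print("automotive") }
        if activity.cycling { print("cycling") }
        if activity.unknown { print("unknown") }
    }
}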

iOS devices have three types of motion sensors:

  • Gyroscope: This measures the device's rotation rate (angular velocity)
  • Accelerometer: This measures device acceleration
  • Magnetometer or compass: This measures the magnetic field around the device

They also have a barometer, which estimates elevation changes from air pressure, and some other sensors, but those are less relevant for our purposes. We will use the accelerometer data stream to train our KNN classifier and predict custom motion types, such as shaking the phone or squatting.
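Not every device exposes every sensor, so it is worth checking availability before subscribing to updates. Here is a minimal sketch using CMMotionManager's availability flags (not part of the book's example project):

import CoreMotion

let motionManager = CMMotionManager()

// Check which inertial sensors are present on this device.
print("Accelerometer available:", motionManager.isAccelerometerAvailable)
print("Gyroscope available:", motionManager.isGyroAvailable)
print("Magnetometer available:", motionManager.isMagnetometerAvailable)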

The following listing shows how to get updates from the accelerometer:

import CoreMotion

let manager = CMMotionManager()
// Deliver a new reading every 0.1 seconds (10 Hz).
manager.accelerometerUpdateInterval = 0.1
manager.startAccelerometerUpdates(to: OperationQueue.main) { (data: CMAccelerometerData?, error: Error?) in
    if let acceleration = data?.acceleration {
        // Acceleration is reported in g's along the device's x, y, and z axes.
        print(acceleration.x, acceleration.y, acceleration.z)
    }
}

The accelerometer APIs in Core Motion provide a time series of three-dimensional vectors, as shown in the following diagram:

Figure 3.7: Core Motion coordinate system for accelerometer and gyroscope

To train our classifier, we need some labeled data. As we don't have a ready-made dataset, and motion signals can differ a lot from person to person, we are going to let the user add new samples and improve the model. In the interface, the user selects the type of motion they want to record and presses the Record button, as shown in the next screenshot. The application samples 25 acceleration vectors, takes the magnitude of each vector, and feeds the resulting feature vector, together with the label of the selected motion type, into the KNN classifier. The user can record as many samples as they want.
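The following is a rough sketch of that pipeline, assuming a hand-rolled in-memory KNN; the names SimpleKNNClassifier, addSample, predict, and handle are illustrative, not the ones used in the example project:

import CoreMotion

// A minimal in-memory KNN: labels are motion-type names, features are the
// 25 acceleration magnitudes of one recorded sample.
class SimpleKNNClassifier {
    private var samples: [(features: [Double], label: String)] = []
    var k = 3

    func addSample(_ features: [Double], label: String) {
        samples.append((features, label))
    }

    func predict(_ features: [Double]) -> String? {
        guard !samples.isEmpty else { return nil }
        // Euclidean distance to every stored sample, then keep the k nearest.
        let neighbors = samples
            .map { (distance: euclidean($0.features, features), label: $0.label) }
            .sorted { $0.distance < $1.distance }
            .prefix(k)
        // Majority vote among the k nearest neighbors.
        let votes = Dictionary(grouping: neighbors, by: { $0.label })
        return votes.max { $0.value.count < $1.value.count }?.key
    }

    private func euclidean(_ a: [Double], _ b: [Double]) -> Double {
        return sqrt(zip(a, b).map { ($0 - $1) * ($0 - $1) }.reduce(0, +))
    }
}

// Recording one labeled sample: collect 25 readings, reduce each vector to
// its magnitude, and feed the resulting feature vector to the classifier.
let classifier = SimpleKNNClassifier()
var window: [Double] = []

func handle(_ data: CMAccelerometerData, selectedLabel: String) {
    let a = data.acceleration
    window.append(sqrt(a.x * a.x + a.y * a.y + a.z * a.z)) // vector magnitude
    if window.count == 25 {
        classifier.addSample(window, label: selectedLabel)
        window.removeAll()
    }
}

Because the classifier just stores labeled feature vectors, every new recording takes effect immediately, which is exactly why KNN suits on-device, runtime-updatable training here.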
