Posted by: Yanyan | September 7, 2013

iOS version of Seattle coming

Just a quick announcement here, but an exciting one. An iOS version of Seattle will be coming soon. After that we’ll also have our sensor app ready to roll on iOS!

Posted by: Yanyan | August 2, 2013

Name Change: Sensibility Testbed

I’m glad to announce that the name of our project has changed to Sensibility Testbed. It somewhat makes me think of the book “Sense and Sensibility” by Jane Austen 🙂

The project page is set up here: http://sensibilitytestbed.poly.edu/. We are excited to have a few new people joining in.

Posted by: Yanyan | June 30, 2013

Follow Up: Accelerometer Data Processing

A friend asked me a few questions about an earlier post on Accelerometer Sensor Data Processing. I think it’s probably better explained in a follow-up post. First, how to get the acceleration values; here is a bit more detail than the previous post. In this method:

public class AccelerationEventListener implements SensorEventListener {

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Clone the values so a later sensor event doesn't overwrite them
        float[] values = event.values.clone();
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this example
    }
}

values[] is an array of 3 floats: values[0], values[1] and values[2] are the acceleration plus gx/gy/gz on the x/y/z axes respectively. See the description at this link. The reason there is an extra component of g is that the force of gravity always influences the measured acceleration. To measure the real acceleration of the device, the gravity portion must be eliminated. The link above gives an example of how to isolate the force of gravity using an inverse low-pass filter (equivalent to a high-pass filter), to filter out the constant downward gravity component of the accelerometer data.

// gravity[] and ALPHA are fields of the enclosing class: a running gravity
// estimate (float[3]) and a smoothing constant between 0 and 1.
private float[] highPass(float x, float y, float z) {
    float[] filteredValues = new float[3];

    // Low-pass filter to estimate the gravity component on each axis...
    gravity[0] = ALPHA * gravity[0] + (1 - ALPHA) * x;
    gravity[1] = ALPHA * gravity[1] + (1 - ALPHA) * y;
    gravity[2] = ALPHA * gravity[2] + (1 - ALPHA) * z;

    // ...then subtract it to keep only the high-frequency changes.
    filteredValues[0] = x - gravity[0];
    filteredValues[1] = y - gravity[1];
    filteredValues[2] = z - gravity[2];

    return filteredValues;
}

To see how this works, you call highPass() every time the sensor value changes:

public void onSensorChanged(SensorEvent event) {
    float[] values = event.values.clone();
    values = highPass(values[0], values[1], values[2]);
}

This way, gravity[] is just a helper array: it gets updated every time the acceleration value changes, and that is a very frequent change.
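For reference, here is a minimal sketch of the two fields the snippet above assumes (the concrete ALPHA value is just an assumption; it is a smoothing factor between 0 and 1):

private static final float ALPHA = 0.8f;     // closer to 1 = slower-changing gravity estimate
private float[] gravity = new float[3];      // running estimate of the gravity component per axis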

Finally, the difference between the accelerometer and Sensor.TYPE_LINEAR_ACCELERATION is this: the linear acceleration sensor is a synthetic sensor, with gravity already filtered out. Here let me quote the book “Professional Android Sensor Programming”:

From Android 2.3 onward, for convenience, developers also have the synthetic sensors Sensor.TYPE_GRAVITY and Sensor.TYPE_LINEAR_ACCELERATION available. These sensors factor out the force due to gravity and other accelerations.

This is a very good book BTW!
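To try the linear acceleration sensor out, here is a minimal sketch of requesting it (assuming a SensorManager called sensorManager and the listener from above called listener; both names are placeholders):

Sensor linearAccel = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
if (linearAccel != null) {
    // The values delivered to the listener already have gravity removed
    sensorManager.registerListener(listener, linearAccel, SensorManager.SENSOR_DELAY_NORMAL);
}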

Posted by: Yanyan | June 5, 2013

SeattleOnAndroid and SeattleSensors update

Since the last two posts on Seattle on Android in November, there have been some updates to both Seattle on Android and Seattle Sensors.

First, Gaetano has bundled the Seattle installer together with Python for Android. Now it takes only one click on your phone, instead of installing Seattle and Py4A separately. The latest Seattle can be downloaded from the Google Play Store here. Once installed, there will be a link to the available Android sensors, which are provided by our second app, Seattle Sensors. That app can be found and installed from here. The following steps are needed to set up Python and Repy (Restricted Python) for running code on Seattle VMs.

Python: When you install the Android SDK, it comes with a tool called ADB (Android Debug Bridge). The adb executable is located in android-sdk/platform-tools. One way to find your android-sdk path is to look at Window -> Android SDK Manager in Eclipse; it is shown as the SDK Path. ADB lets you communicate with an emulator instance or a connected Android device from your desktop or laptop. To enable shell access to the Python interpreter on your phone, first create a script as follows:

#!/system/bin/sh
# Environment for the Python interpreter that Seattle bundles under
# /data/data/com.seattletestbed, so it can be started from an adb shell.

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/data/data/com.seattletestbed/files/python/lib
export PYTHONPATH=/mnt/sdcard/com.seattletestbed/extras/python:/data/data/com.seattletestbed/files/python/lib/python2.7:/data/data/com.seattletestbed/files/python/lib/python2.7/lib-dynload
export PYTHONHOME=/data/data/com.seattletestbed/files/python
export TEMP=$TEMP:/mnt/sdcard/com.seattletestbed/extras/tmp/
export PATH=$PATH:/data/data/com.seattletestbed/files/python/bin

Name it exports.sh, and change its permission to 755. Then connect your phone or tablet with a USB cable, and do

adb push exports.sh /data/local/tmp

Note that if your phone is not rooted, the critical part is the path /data/local/tmp. Otherwise, the executable permission of exports.sh (755) will not be preserved after you push it onto the phone, because of the noexec option set on other locations. For example, I once pushed my script to /sdcard/ and its permissions changed to 664. Some directories are not even writable. After this, do the following to access your device from the command line:

adb shell

shell@android:/ $ . /data/local/tmp/exports.sh

If you now type python and see the greeting message, then your setup is successful.

Repy: Repy is the restricted version of Python that Seattle supports. To prepare to run any Repy code, do the following:

  1. Download a restrictions file such as this one
  2. Upload the restrictions file and your Repy code to your Android device, e.g. using adb push restriction_file /sdcard/sl4a/seattle/seattle_repy and then adb push repy_code /sdcard/sl4a/seattle/seattle_repy

Meanwhile, to allow Repy to collect data from your phone, you have to get Seattle Sensors running. Seattle and Seattle Sensors communicate with each other via XML-RPC. Now you can run python repy.py restrictions.test your_script.repy from the command line!

Posted by: Yanyan | May 12, 2013

Measuring Air Pressure

In an earlier post by Albert, we saw PressureNet, which uses barometer data on smartphones. On Android, getting the ambient air pressure is quite simple. The following code is from the Android developers page.

public class BarometerSensor extends Activity implements SensorEventListener {
    private SensorManager mSensorManager;
    private Sensor mPressure;

    @Override
    public final void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mPressure = mSensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Start receiving pressure updates
        mSensorManager.registerListener(this, mPressure, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSensorManager.unregisterListener(this);
    }

    @Override
    public final void onSensorChanged(SensorEvent event) {
        // Ambient air pressure in millibars (hPa)
        float millibars_of_pressure = event.values[0];
    }

    @Override
    public final void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this example
    }
}

As simple as it looks, event.values[0] is the value of the air pressure. Further, we can get the device altitude from the pressure value:

float altitude = SensorManager.getAltitude(SensorManager.PRESSURE_STANDARD_ATMOSPHERE, millibars_of_pressure);

The advantage of using the air pressure value to calculate altitude rather than GPS is that GPS is not a viable option inside a building, and in many cases the elevation measured with GPS is inaccurate. When the elevation is calculated using the atmospheric pressure sensor, however, we can calculate it with higher accuracy by combining the measured atmospheric pressure with the sea-level atmospheric pressure. Moreover, we can immediately show small changes in the elevation.

Posted by: Yanyan | April 11, 2013

Remote Execution of SQL in Android JavaScript

Using the previous approach (PhoneGap/Cordova APIs), app developers can run a native wrapper of HTML5/JavaScript on various mobile platforms. I recently explored the option of doing remote execution of SQL in HTML5/JavaScript, using Android as an example.

The app first finds the current location of the phone, regardless of whether it comes from WiFi, GPS, cellular, etc. Then, given this location, it queries a remote database to find the closest bus stop. In this app, there are a few places where asynchronous calls happen. First, to get the geolocation of the phone, we do

navigator.geolocation.getCurrentPosition(onSuccess, onError, { enableHighAccuracy: true });

onSuccess and onError are the first callback functions here; they are invoked asynchronously when navigator.geolocation.getCurrentPosition() obtains a result. If the geolocation is returned correctly, onSuccess() will be called, otherwise onError():

function onError(error) {
    alert('getCurrentPosition error...code: ' + error.code + '\n' +
          'message: ' + error.message);
}

function onSuccess(position) {
    lat = position.coords.latitude;
    lon = position.coords.longitude;
    closest_stop(function(stop) {
        // find the closest stop from array stop
    });
}

Note that the option { enableHighAccuracy: true } is only needed on Android; it enables the device to use GPS as the location source. In the callback function onSuccess() above, we use the variables lat and lon to store the latitude and longitude of the current location. Then, based on this location, we call another asynchronous function, closest_stop().

function closest_stop(callback_func) {
    var stop = new Array();
    var i = 0;
    var query = "SELECT stop_lat, stop_lon, stop_id, stop_name FROM stops;";
    query = query.replace(/ /g, "%20");

    $.ajax({
        url: 'http://cengiz.cs.uvic.ca:2080/info.php?query=' + query,
        dataType: 'jsonp',
        success: function(data) {
            $.each(data, function(object) {
                $.each(data[object], function(values) {
                    var entry = data[object][values];
                    stop[i] = new Array(entry.stop_lat, entry.stop_lon, entry.stop_id, entry.stop_name);
                    i++;
                });
            });
            callback_func(stop);
        },
        error: function(data) {
            console.log("Error");
        }
    });
}

In closest_stop(callback_func), there is an ajax call that queries the remote database. If the execution of “SELECT stop_lat, stop_lon, stop_id, stop_name FROM stops;” is successful, it stores the returned data in the array stop. In our database, the number of stops is quite high, so it takes a while to get all the records back from the server. The callback callback_func(stop) will only be invoked when all the returned data is ready. When onSuccess() calls closest_stop(function(stop){…}), the parameter of closest_stop is that callback_func(stop), in which we can use the array of stops to calculate which one is closest to the phone’s current location.

Posted by: Yanyan | March 6, 2013

PhoneGap/Cordova Again

In my earlier post this January, I talked about cross platform programming on mobile devices. In the past week, I had the pleasure to work on PhoneGap again.

First, how it works. On Android, step-by-step instructions for setting up PhoneGap can be found here. All the platform-independent code, i.e. HTML and JavaScript, goes in assets/www. index.html in assets/www is the main entry point for your PhoneGap application’s interface, just like an Android main activity. To use the Cordova library, cordova-xxx.js and cordova-xxx.jar need to go into the right places, and the library build path has to be set accordingly. Meanwhile, the main activity class needs to import org.apache.cordova.DroidGap, and in its onCreate() method, replace setContentView() with super.loadUrl(“file:///android_asset/www/index.html”).
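Putting those pieces together, a minimal sketch of what the main activity might look like (the class name MainActivity is just a placeholder):

import org.apache.cordova.DroidGap;
import android.os.Bundle;

public class MainActivity extends DroidGap {

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Load the PhoneGap entry point instead of calling setContentView()
        super.loadUrl("file:///android_asset/www/index.html");
    }
}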

index.html will then load your JavaScript code, which performs its functionality by calling PhoneGap APIs. For example, to get the geolocation in JavaScript code, call navigator.geolocation.getCurrentPosition(onSuccess, onError). This does basically the same thing as native Android code that uses GPS or the network location service to get the current latitude and longitude of your device.

Posted by: aaaaalbert | February 8, 2013

In other news: The PressureNet barometer network

In case you, dear reader, wonder why we are so much into interfacing all of the sensors of our devices and sharing the data, check out PressureNet (available for Android). This app collects location and barometer data on smartphone/tablet devices like ours, and will hopefully help improve the quality of weather prediction. Similar to (though more coarse-grained than) what we are doing in Sensorium, the PressureNetizens worry about the privacy of the device owners, and let you opt out of publishing your data.

At the time of writing this post, the PressureNet guys didn’t have a policy for sharing or publicly accessing the data yet. I’m curious how this develops!

Posted by: Yanyan | January 20, 2013

Native vs. HTML5 Apps, or Somewhere in Between

Seattle Sensors is a native Android app. By native, I mean the code and app are specific to a certain mobile platform, like Xcode and Objective-C for iOS, or Eclipse and Java for Android. However, we want Seattle to run across multiple platforms, so to deploy Seattle on Android, iOS, Windows Phone, etc., each platform would need a different version of the code.

So there are HTML5 apps as an alternative, which use standard web technologies: typically HTML5, JavaScript and CSS. HTML5 apps are device agnostic and can be opened with any modern mobile browser. Apps are therefore easier to write, as the technology bar is lower. This write-once-run-anywhere approach to mobile development creates cross-platform mobile applications that work on multiple devices. The solution sounds promising, but according to Mark Zuckerberg, Facebook’s biggest mistake was using HTML5 for mobile development. The biggest problems are, AFAIK, speed and functionality. Here is a table comparing native apps, HTML5 apps, and their hybrid.

A hybrid app is then somewhere in between: a web app, primarily built using HTML5 and JavaScript, is wrapped inside a thin native container that provides access to native platform features. PhoneGap, under the Apache Cordova project, is the most popular example of such a container for creating hybrid mobile apps. PhoneGap provides APIs that enable programmers to access native operating system functionality using JavaScript. Programmers build the app logic in JavaScript, and the PhoneGap API handles communication with the native operating system. We may think of this kind of app as a chrome-less web browser: it renders HTML content, without the chrome or window decoration of a regular web browser. The web view used by PhoneGap is the same web view used by the native operating system. On iOS, this is the Objective-C UIWebView class; on Android, this is android.webkit.WebView. From the list of supported features, more functionality is available through this bundle, such as camera, accelerometer, contacts, etc.

From what I’ve done in the past week, IMO using the PhoneGap/Cordova APIs is still very slow compared to native, even when the data is stored on the device. HTML5, CSS and JavaScript together are platform independent, but there will be times when you find it painful to locate a PhoneGap plugin for a specific purpose (e.g., I had problems with the PhoneGap SQLite plugin on Android). I guess this somewhere-in-between is not quite there yet, maybe?

Posted by: Yanyan | January 1, 2013

Accelerometer Sensor Data Processing

Happy New Year and Gong Hee Fot Choy!

In the past few days I have been playing with the accelerometer sensor on my Galaxy Nexus. In MEMS sensors, acceleration is measured by attaching a mass to springs and seeing how far the mass deviates from its equilibrium position. It is then easy to see why an accelerometer in free fall will report zero acceleration even though it is still subject to Earth’s gravity. Data is collected in the 3 dimensions X/Y/Z. The default orientation for phones is portrait, but this is not true of most tablets. However, even for a device that has a default orientation of landscape, the axes will still be oriented with Y pointing up, X pointing to the right, and Z pointing out of the screen.

On Android, to collect data from any sensor, an app needs to register a SensorEventListener to receive sensor data, extract data from SensorEvent depending on the sensor type, and ensure that the app unregisters at the right time. The listener can be implemented like so:

public class AccelerationEventListener implements SensorEventListener {

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Clone the values so a later sensor event doesn't overwrite them
        float[] values = event.values.clone();
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this example
    }
}
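The registering and unregistering part is not shown above. A minimal sketch of how an activity might do it (sensorManager and listener are assumed fields; the names are my own placeholders):

@Override
protected void onResume() {
    super.onResume();
    Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    // Start receiving accelerometer events while the activity is in the foreground
    sensorManager.registerListener(listener, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
}

@Override
protected void onPause() {
    super.onPause();
    // Stop listening to save battery when the activity goes to the background
    sensorManager.unregisterListener(listener);
}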

values[0] to values[2] will have the data from the X, Y and Z axes. However, the raw sensor data can be erroneous, containing background noise and drift, and of course the Z value has the extra g ≈ 9.81 m/s^2 from gravity. I plotted the accelerometer values while the phone was lying flat on a desk; the result is very obvious.

[Screenshot: accelerometer readings while the phone lies flat on a desk]

First, to mitigate some errors and spikes in the data, in many cases an app may rely on some form of smoothing or averaging, also known as low-pass filtering (it filters out high-frequency noise and “passes” low-frequency or slowly varying changes). This is very much like signal processing in electrical engineering. The simplest form of low-pass filter is weighted smoothing: define a smoothing parameter (or weighting value) a, a value from 0 to 1, such that (new mean) = (previous filtered value) * (1 - a) + (new value) * a. It can be implemented as

// filteredValues must persist across calls (e.g. as a field initialized to
// new float[3]); otherwise the previous mean is lost and no smoothing happens.
float[] lowPass(float x, float y, float z) {
    // new mean = new value * a + previous mean * (1 - a)
    filteredValues[0] = x * a + filteredValues[0] * (1.0f - a);
    filteredValues[1] = y * a + filteredValues[1] * (1.0f - a);
    filteredValues[2] = z * a + filteredValues[2] * (1.0f - a);

    return filteredValues;
}

To be slightly more sophisticated, you can use a moving average of the last k sensor values as the filtered data. On the other hand, the simplest way to perform high-pass filtering is to do a low-pass filter and then subtract the result from the sensor data. The following code, from the Android docs, filters out the constant downward gravity component of the accelerometer data and keeps the higher-frequency transient changes.

// gravity[] and ALPHA are fields of the enclosing class: a running gravity
// estimate (float[3]) and a smoothing constant between 0 and 1.
private float[] highPass(float x, float y, float z) {
    float[] filteredValues = new float[3];

    // Low-pass filter to estimate the gravity component on each axis...
    gravity[0] = ALPHA * gravity[0] + (1 - ALPHA) * x;
    gravity[1] = ALPHA * gravity[1] + (1 - ALPHA) * y;
    gravity[2] = ALPHA * gravity[2] + (1 - ALPHA) * z;

    // ...then subtract it to keep only the transient changes.
    filteredValues[0] = x - gravity[0];
    filteredValues[1] = y - gravity[1];
    filteredValues[2] = z - gravity[2];

    return filteredValues;
}
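By the way, here is a rough sketch of the moving-average variant mentioned above (the window size K and the history buffer are names I am introducing for illustration):

private static final int K = 10;              // window size, a tuning choice
private float[][] history = new float[K][3];  // last K samples (zeros bias the result until it fills up)
private int index = 0;

private float[] movingAverage(float x, float y, float z) {
    history[index] = new float[] { x, y, z };
    index = (index + 1) % K;

    float[] avg = new float[3];
    for (float[] sample : history) {
        avg[0] += sample[0] / K;
        avg[1] += sample[1] / K;
        avg[2] += sample[2] / K;
    }
    return avg;
}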

Of course, given the high-pass and low-pass filters, there is something that does both: the band-pass filter. In its simplest incarnation, and in the form most useful for most Android sensor applications, it is simply a combination of a low-pass and a high-pass filter: data is first filtered to keep the higher-frequency components, and then the very high-frequency noise is filtered out with a low-pass smoothing filter. A minimal sketch is below, followed by some screenshots showing the result of applying one or both of the filters.
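As a rough sketch, reusing the lowPass() and highPass() methods from above, a band-pass filter could simply chain the two:

// High-pass first to remove the constant gravity component,
// then low-pass the result to smooth out the remaining high-frequency noise.
private float[] bandPass(float x, float y, float z) {
    float[] hp = highPass(x, y, z);
    return lowPass(hp[0], hp[1], hp[2]);
}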

[Screenshots: accelerometer readings after applying one or both of the filters]

BTW, the Linear Acceleration Sensor is a synthetic sensor provided by Android which factors out the force due to gravity. It is available from Android 2.3 (API level 9).

