Augmented Reality GPS

Sunday, October 27, 2013
This past week I prepared a new proposal. I wanted to do a similar location-based mobile app, preferably one that uses the accelerometer. The proposal may be found at:
https://www.dropbox.com/s/9lfrr7ocj1mpfnc/ProjectPropsal-KevinShen-location_tracking.pdf
I agree with Mike that it is best to push this project to the spring semester.
Sunday, October 20, 2013
Scaled back
After realizing my last idea was not feasible because it's impossible to get GPS accuracy to within only a few feet, I've decided to write a new proposal for a scaled-back idea with a similar goal.
After reading more about the Google Location API (intro here, and an article describing more of its capabilities here), a navigation app is still possible. Instead of drawing markers on a video feed, which requires accuracy within a few feet or computer vision work outside the scope of this project, I propose an app that alerts the user when they deviate from the path to their intended destination.
This would be accomplished with the Location API's geofences, which let an application know when a user has crossed a defined boundary. Setting up multiple geofences on either side of a user's walking path and buzzing or playing a sound when the user crosses one of these fences would help the user stay headed in the right direction. I haven't played with the API enough to know the exact technical challenges, which will be outlined in a revised formal proposal.
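As a rough sketch of what registering one of these fences might look like (I'm using the GeofencingClient names from current Google Play services, so the details may differ from the Location API version linked above; the coordinates and alert PendingIntent are placeholders):

```java
// Rough sketch only: registering a geofence placed just off the walking path.
// Entering it means the user has drifted off the route and should be alerted.
// Requires the ACCESS_FINE_LOCATION permission.
import android.app.PendingIntent;
import android.content.Context;

import com.google.android.gms.location.Geofence;
import com.google.android.gms.location.GeofencingClient;
import com.google.android.gms.location.GeofencingRequest;
import com.google.android.gms.location.LocationServices;

public class PathFences {

    private final GeofencingClient client;

    public PathFences(Context context) {
        client = LocationServices.getGeofencingClient(context);
    }

    /** Adds one circular fence beside the path; alertIntent points at whatever buzzes or plays a sound. */
    public void addFence(String id, double lat, double lng, float radiusMeters,
                         PendingIntent alertIntent) {
        Geofence fence = new Geofence.Builder()
                .setRequestId(id)                                  // identifies this fence in callbacks
                .setCircularRegion(lat, lng, radiusMeters)
                .setExpirationDuration(30 * 60 * 1000)             // drop the fence after 30 minutes
                .setTransitionTypes(Geofence.GEOFENCE_TRANSITION_ENTER)
                .build();

        GeofencingRequest request = new GeofencingRequest.Builder()
                .addGeofence(fence)
                .build();

        client.addGeofences(request, alertIntent);
    }
}
```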
An app like this would be especially useful for walking in urban areas at night, when stopping to check your phone for directions can be dangerous. It may also help people with visual impairments, or users who are listening to music and would prefer to hear walking directions through their headphones.
Sunday, October 13, 2013
Pivoting
Since the last post, I have implemented reading raw sensor data from the phone. This was demoed during alpha reviews this past Tuesday.
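For reference, the raw readings come through Android's SensorManager; a minimal accelerometer listener looks roughly like this (not the exact code from the demo):

```java
// Minimal sketch of reading raw accelerometer data via SensorManager.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

public class AccelerometerReader implements SensorEventListener {

    private final SensorManager sensorManager;
    private final Sensor accelerometer;

    public AccelerometerReader(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    public void start() {
        // Register for updates; call stop() in onPause() to save battery.
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];  // acceleration in m/s^2 along the device's x axis
        float y = event.values[1];
        float z = event.values[2];
        Log.d("Sensors", "accel: " + x + ", " + y + ", " + z);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for raw logging.
    }
}
```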
One major problem that came up during alpha review was that GPS may be inaccurate to the point where the location is only known within a 30-foot radius. Some mapping applications mitigate this by using nearby Wi-Fi networks to improve the location fix. That kind of localization work is out of scope for this project, and Dr. Badler suggested looking at other apps based on data from the accelerometer and other hardware sensors.
Currently I'm considering other applications that use the accelerometer in a creative way, to try to come up with a new idea. Health and wellness apps like Sleep as Android, which measures your movement when you place the phone on the bed next to you as you sleep, seem like a promising direction (and more interesting than games).
Sunday, October 6, 2013
Navigation APIs
Unfortunately, I wasn't able to make much progress this week. In my proposal, I suggested OpenStreetMap as a possible alternative to Google Maps. After more digging, it turns out OpenStreetMap only provides raw map data and does not publish a navigation API, so I will use the Google Directions API to get navigation waypoints.
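The Directions web service is just an HTTP endpoint that returns JSON, so pulling waypoints out of a walking route should look something like the sketch below (the parsing is illustrative rather than exhaustive, it assumes an API key, and it must run off the main thread on Android):

```java
// Sketch: fetch a walking route from the Google Directions web service and
// return the end point of each step as (lat, lng) pairs.
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Scanner;

import org.json.JSONArray;
import org.json.JSONObject;

public class DirectionsFetcher {

    private static final String ENDPOINT = "https://maps.googleapis.com/maps/api/directions/json";

    public static double[][] fetchWalkingWaypoints(String origin, String destination,
                                                   String apiKey) throws Exception {
        String query = "?origin=" + URLEncoder.encode(origin, "UTF-8")
                + "&destination=" + URLEncoder.encode(destination, "UTF-8")
                + "&mode=walking&key=" + apiKey;

        HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT + query).openConnection();
        try (InputStream in = conn.getInputStream();
             Scanner scanner = new Scanner(in, "UTF-8")) {
            String body = scanner.useDelimiter("\\A").next();   // read the whole response

            JSONArray steps = new JSONObject(body)
                    .getJSONArray("routes").getJSONObject(0)    // first suggested route
                    .getJSONArray("legs").getJSONObject(0)      // single leg for one destination
                    .getJSONArray("steps");

            double[][] waypoints = new double[steps.length()][2];
            for (int i = 0; i < steps.length(); i++) {
                JSONObject end = steps.getJSONObject(i).getJSONObject("end_location");
                waypoints[i][0] = end.getDouble("lat");
                waypoints[i][1] = end.getDouble("lng");
            }
            return waypoints;
        } finally {
            conn.disconnect();
        }
    }
}
```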
Development of the Android app is at the same place it was last week, but I should be able to meet my alpha review goals in time for my review on Tuesday.
Friday, September 27, 2013
Hello Android
Progress this week was geared towards ramping up the Android application. I started by reading the documentation Mike linked in comments on my previous blog post. I also set up the Android environment and ran through several basic Android tutorials. Using what I learned from the documentation and tutorials, I have made a basic app that displays the camera feed on the screen of an Android phone.
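The gist of the camera-feed app is a SurfaceView backed by the android.hardware.Camera API; roughly like this, though not my exact code:

```java
// Rough equivalent of the camera-preview app, using the original
// android.hardware.Camera API (deprecated in later Android releases).
// Requires the CAMERA permission in the manifest.
import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

import java.io.IOException;

public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {

    private Camera camera;

    public CameraPreview(Context context) {
        super(context);
        getHolder().addCallback(this);   // get notified when the drawing surface is ready
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            camera = Camera.open();              // default back-facing camera
            camera.setPreviewDisplay(holder);    // draw preview frames onto this surface
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // A fuller version would restart the preview with the new size here.
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (camera != null) {
            camera.stopPreview();
            camera.release();                    // free the camera for other apps
            camera = null;
        }
    }
}
```

An Activity can then simply call setContentView(new CameraPreview(this)) to show the feed full-screen.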
Next steps will be to read raw orientation and acceleration data from hardware sensors. There should also be some time devoted to developing a UI and building functionality to allow querying of Google's Directions API to get navigation directions from the user's current position.
Saturday, September 21, 2013
Initial commit
This is the first in a series of weekly posts for my DMD senior design project. My proposal for Augmented Reality Landmark Navigation and other senior projects can be found here.
The proposal outlines a mobile app that helps users navigate street directions. The main feature of the app will be displaying directions with waypoint markers overlaid on a live video feed. The app will take advantage of the phone's GPS coordinates and internal orientation sensors to achieve this.
The largest unknown in my proposal was which platform and phone to use. Since writing the proposal, I've decided to use Qualcomm's Vuforia SDK for Android. This decision was informed by a talk given by Serafin Diaz of Qualcomm Research during which the capabilities of Vuforia were demonstrated. In particular, Vuforia provides excellent object tracking, which may be necessary for calibration and ensuring the accuracy of where waypoints are rendered.
I chose to use Android as opposed to iOS because of my familiarity with Java.
Battery life is one concern that has come up from the Qualcomm talk and my readings about mobile development. Because AR applications are computationally intensive and battery life is extremely limited, this app should be optimized for conserving battery. One way to do this is with efficient data transfers, as described in this blog post. Another possibility is to use a phone with a Qualcomm processor in it, since the Vuforia SDK is optimized for these processors. A third goal for battery optimization is efficient memory usage - using RAM is power-expensive. These are all things to consider while I am developing this application, and should be outlined now as opposed to later, when I may have coded myself into a power-inefficient corner.