Recently I was working on a project at work involving augmented reality, specifically targeting the mining industry.
One of the requirements was that the augmented reality run on portable devices.
We had already developed a proof-of-concept application for the Apple iPhone running iOS. That worked fine for the most part, with few hiccups along the way.
We also wanted to target devices running Google's Android operating system, and to do that we needed to experiment with the platform.
Initially we chose to stick with something we knew fairly well for platform development: the world of .NET, which most of our products, WinForms and web, leverage.
One of our developers tried to get MonoDroid to use the picture preview handler to obtain data from the device, which in this instance was a mobile phone with two cameras, front and back. He ran into problems before going on holidays.
With work being somewhat slow this time of year for us, I was tasked with experimenting with the native Java API for Android to see what I could get cooking. If all worked well, once the other developer returned from his holidays, I could just hand over the native implementation and let him sort out the MonoDroid side of things in .NET land.
I've had my personal Samsung Galaxy S for almost six months now and have had Eclipse with the Android SDK installed on my home development machine, but I had yet to do anything native with the device. This was going to be a learning experience I was quite looking forward to: getting paid to do something I love, writing code and learning!
I jumped into Eclipse and updated the Android SDK environment to begin the process. Having never written a line of Java in my 25 or so years of software development, I was eager to see what I was getting into.
Google was my first port of call for a simple "hello world" native Java Android application. I found a couple of examples I was happy with and began to understand the flow of things; I'm a quick study when it comes to new languages.
Now that a basic application was building and deploying to the Android emulator environment I had set up, the next logical step was to get the application to display live preview images from the rear-facing camera within that same emulator.
A bit of research suggests that this is in fact supported. So I dive into the API documentation on the Android site and begin to cobble together something resembling what I think will work. No surprise here: I run into a couple of issues with new language constructs and have to rewire my thinking. A few iterations and deployments to the emulator later, I finally have live simulated video from the camera being displayed within the application.
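In condensed form, the preview wiring amounts to pointing the camera at a SurfaceHolder and starting the preview once the surface exists; this fragment is along the lines of the full listing at the end of this article:

    // Condensed sketch of the preview wiring; the full listing below
    // shows the complete callback implementation.
    SurfaceHolder holder = surfaceView.getHolder();
    holder.setType( SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS ); // needed pre-Android 3.0
    holder.addCallback( new SurfaceHolder.Callback() {
        public void surfaceCreated( SurfaceHolder h ) {
            camera = Camera.open();            // default rear-facing camera
            try {
                camera.setPreviewDisplay( h ); // route preview frames to the surface
                camera.startPreview();
            } catch ( IOException e ) {
                Log.e( "Preview", "setPreviewDisplay() failed", e );
            }
        }
        public void surfaceChanged( SurfaceHolder h, int format, int w, int ht ) { }
        public void surfaceDestroyed( SurfaceHolder h ) {
            camera.stopPreview();
            camera.release();
        }
    } );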
Now, onto the more difficult task of grabbing that video and saving out images as they're taken in near real time (as near as is feasible, which I'll cover later).
I hook into the preview callback for the camera. This gets me the camera instance and the raw byte array of data fed from the camera. Excellent, this may be easier than I think!
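The hook itself is small; each frame arrives as a raw byte array in whatever preview format the camera is currently using (again, a fragment of the listing below):

    // Each preview frame is delivered as a raw byte array in the
    // camera's current preview format.
    camera.setPreviewCallback( new Camera.PreviewCallback() {
        public void onPreviewFrame( byte[] data, Camera camera ) {
            // 'data' is one raw frame; its format and dimensions come
            // from camera.getParameters()
        }
    } );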
Wrong! Android has a quirky way of working: you can't just set something without first knowing the capabilities of the thing you're setting, and the camera is one of those things.
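For example, before settling on a preview format or size you're meant to ask the driver what it actually supports; here's a sketch of that querying (the full listing below hard-codes its choices instead):

    // Query capabilities before setting anything; setParameters() throws
    // if a device doesn't support the requested values.
    Camera.Parameters params = camera.getParameters();
    List<Integer> formats = params.getSupportedPreviewFormats(); // e.g. NV21
    List<Camera.Size> sizes = params.getSupportedPreviewSizes();
    if ( formats.contains( ImageFormat.NV21 ) ) {
        params.setPreviewFormat( ImageFormat.NV21 );
    }
    Camera.Size size = sizes.get( 0 ); // pick a size the device reports
    params.setPreviewSize( size.width, size.height );
    camera.setParameters( params );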
I believe I have that sorted out, only to discover that image formats are not consistent across vendor implementations and deployments of the Android operating system. *sigh* I recently read about exactly that in an article by a game developer. Fine; I'll find the image format, which turns out to be YUV, and write my own decoder.
Well, it seems this has been done before and I had just reinvented the wheel. Go me! I look into the specific Android versions on our development devices as well as on my own phone. Jackpot! They've got 2.2 update 1 or better! This is wonderful, because those versions support YUV via the YuvImage class, which can be converted to the other format I'm interested in: JPEG.
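The conversion itself is only a few lines; this is essentially what the full listing does with each NV21 frame:

    // NV21 preview frame -> JPEG using YuvImage (Android 2.2+).
    YuvImage yuvImage = new YuvImage( data, ImageFormat.NV21, width, height, null );
    ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
    yuvImage.compressToJpeg( new Rect( 0, 0, width, height ), 80, jpegStream );
    byte[] jpegBytes = jpegStream.toByteArray();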
A bit more hacking on the code and I finally have the application grabbing data from the camera's preview callback and saving that data out to what should be JPEG files at a hard-coded location in the file system.
A quick reconfiguration of the physical test device to gain access to its internal storage, so the images can be examined and previewed from the development machine, yields the expected positive outcome: I now have JPEG images that I can do something with.
With my task complete, I rejoice by telling others in the office how things went, how pleased I was with the results, and how much I enjoyed learning a new language and platform that I hope to continue with as time goes on.
The target of the experiment wasn't so much the saving of files; that was just a convenient way of verifying results. When the other developer returned from holidays, he was able to take the image byte array from the device and stream it to a connected client browser via HTTP.
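That streaming side is outside the scope of this article, but conceptually it amounts to writing successive JPEG frames into a multipart/x-mixed-replace (MJPEG) HTTP response, which most browsers render as live video. What follows is a minimal, hypothetical sketch, not the project's actual implementation; nextJpegFrame() stands in for whatever supplies the latest frame bytes:

    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Hypothetical sketch: push successive JPEG frames to a browser as a
    // multipart/x-mixed-replace (MJPEG) HTTP response.
    public class MjpegStreamer
    {
        /** Stand-in for whatever supplies the latest JPEG bytes, e.g. from onPreviewFrame(). */
        public interface FrameSource
        {
            byte[] nextJpegFrame();
        }

        public void serve( int port, FrameSource source ) throws Exception
        {
            ServerSocket server = new ServerSocket( port );
            Socket client = server.accept();
            OutputStream out = client.getOutputStream();
            out.write( ( "HTTP/1.0 200 OK\r\n"
                    + "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n\r\n" ).getBytes( "US-ASCII" ) );
            while ( true )
            {
                byte[] jpeg = source.nextJpegFrame();
                out.write( ( "--frame\r\n"
                        + "Content-Type: image/jpeg\r\n"
                        + "Content-Length: " + jpeg.length + "\r\n\r\n" ).getBytes( "US-ASCII" ) );
                out.write( jpeg );
                out.write( "\r\n".getBytes( "US-ASCII" ) );
                out.flush();
            }
        }
    }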
Mission accomplished.
The purpose of this article was to share an experience, along with some code that others may benefit from, so they can learn from my experience just as I learned from those whose work inspired this application. I hope I have achieved that purpose.
The code for the application follows:
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

import android.app.Activity;
import android.content.res.Configuration;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class ImageTest extends Activity
{
    private SurfaceView surfaceView = null;
    private SurfaceHolder surfaceHolder = null;
    private Camera camera;
    private int imageCount = 0;

    /** Called when the activity is first created. */
    @Override
    public void onCreate( Bundle savedInstanceState )
    {
        super.onCreate( savedInstanceState );
        setContentView( R.layout.main );

        // Wire the SurfaceView from the layout up as the camera's preview target.
        surfaceView = (SurfaceView) findViewById( R.id.SurfaceView01 );
        surfaceHolder = surfaceView.getHolder();
        // Push-buffers surface type is required for camera preview prior to Android 3.0.
        surfaceHolder.setType( SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS );
        surfaceHolder.addCallback( surfaceCallback );

        Log.e( getLocalClassName(), "END: onCreate" );
    }

    @Override
    public void onConfigurationChanged( Configuration newConfig )
    {
        super.onConfigurationChanged( newConfig );
    }

    SurfaceHolder.Callback surfaceCallback = new SurfaceHolder.Callback()
    {
        public void surfaceCreated( SurfaceHolder holder )
        {
            // Open the default (rear-facing) camera once the preview surface exists.
            camera = Camera.open();
            imageCount = 0;
            try
            {
                camera.setPreviewDisplay( surfaceHolder );
            }
            catch ( Throwable t )
            {
                Log.e( "surfaceCallback", "Exception in setPreviewDisplay()", t );
            }
            Log.e( getLocalClassName(), "END: surfaceCreated" );
        }

        public void surfaceChanged( SurfaceHolder holder, int format, int width, int height )
        {
            if ( camera != null )
            {
                // Receive each raw preview frame as a byte array.
                camera.setPreviewCallback( new PreviewCallback()
                {
                    public void onPreviewFrame( byte[] data, Camera camera )
                    {
                        if ( camera != null )
                        {
                            Camera.Parameters parameters = camera.getParameters();
                            int imageFormat = parameters.getPreviewFormat();
                            Bitmap bitmap = null;

                            if ( imageFormat == ImageFormat.NV21 )
                            {
                                // NV21 is the YUV format most devices deliver. Convert it
                                // to JPEG via YuvImage (available from Android 2.2), then
                                // decode to a Bitmap so all format branches end up alike.
                                int w = parameters.getPreviewSize().width;
                                int h = parameters.getPreviewSize().height;
                                YuvImage yuvImage = new YuvImage( data, imageFormat, w, h, null );
                                Rect rect = new Rect( 0, 0, w, h );
                                ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
                                yuvImage.compressToJpeg( rect, 100, outputStream );
                                bitmap = BitmapFactory.decodeByteArray( outputStream.toByteArray(), 0, outputStream.size() );
                            }
                            else if ( imageFormat == ImageFormat.JPEG || imageFormat == ImageFormat.RGB_565 )
                            {
                                // These formats can be decoded directly.
                                bitmap = BitmapFactory.decodeByteArray( data, 0, data.length );
                            }

                            if ( bitmap != null )
                            {
                                // Save each frame to external storage with an incrementing
                                // file name (requires the WRITE_EXTERNAL_STORAGE permission).
                                FileOutputStream fileStream;
                                try
                                {
                                    String filePath = "/sdcard/image" + (imageCount++) + ".jpg";
                                    File imageFile = new File( filePath );
                                    fileStream = new FileOutputStream( imageFile );
                                    bitmap.compress( Bitmap.CompressFormat.JPEG, 80, fileStream );
                                    fileStream.flush();
                                    fileStream.close();
                                }
                                catch ( FileNotFoundException e )
                                {
                                    Log.e( getLocalClassName(), e.toString() );
                                }
                                catch ( IOException e )
                                {
                                    Log.e( getLocalClassName(), e.toString() );
                                }
                                bitmap.recycle();
                                bitmap = null;
                            }
                            Log.e( getLocalClassName(), "onPreviewFrame" );
                        }
                        else
                        {
                            Log.e( getLocalClassName(), "Camera is null" );
                        }
                    }
                } );

                Camera.Parameters parameters = camera.getParameters();
                if ( parameters != null )
                {
                    // Throttle the preview to roughly one frame per second. Note that the
                    // surface dimensions are passed straight through here; strictly, the
                    // preview size should be chosen from getSupportedPreviewSizes(), since
                    // setParameters() throws if a device doesn't support the requested size.
                    parameters.setPreviewFrameRate( 1 );
                    parameters.setPreviewSize( width, height );
                    camera.setParameters( parameters );
                    camera.startPreview();
                }
            }
        }

        public void surfaceDestroyed( SurfaceHolder holder )
        {
            // Stop the preview and release the camera so other applications can use it.
            if ( camera != null )
            {
                camera.stopPreview();
                camera.release();
                camera = null;
            }
            Log.e( getLocalClassName(), "END: surfaceDestroyed" );
        }
    };
}