Interview with Dr. Marco Bitetto [updated with extra references]

Dr. Marco Bitetto is a disabled scientist developing a system that employs sonification of synthetic aperture radar images for use by vision-impaired and blind individuals. He is currently researching and publishing a paper on an interstellar navigation system for near-light-speed travel. His work is supported by a grant from a family endowment that supports the research of disabled scientists and engineers.

H+: Can you tell our readers a bit about your work and personal background?
Dr. B: I have received a grant from ICRI to do the preliminary theoretical design of a navigation system for near-light-speed interstellar travel (this system takes into account the effects explained in Einstein's General and Special Theories of Relativity). At present, I am still working on the completion of a book manuscript on navigation at near light speed for interstellar travel.

In terms of my education, I have a Bachelor of Science in Computer Systems Engineering from the State University of New York and a doctoral degree in Robotics from the Union Institute. My doctoral dissertation deals with machines that learn by example.

H+: Can you tell me something about your medical condition and vision?
Dr. B: I am a legally blind diabetic who has only 5% residual vision. My vision disability was caused primarily by cortical blindness (brain damage to the visual cortex) and secondarily by diabetes.


H+: What led you to the idea to use software and Mathematica in particular to enhance your vision?
Dr. B: I thought of using Mathematica to manipulate x-bit translated images of "jpeg" and "bmp" photographs so that I can computationally process them into a series of sounds. These are handled by an interface program that uses the "wav" file format and can be played back through the iTunes music/sound playing software. (I did this by playing with the PCM-based sound-conversion algorithm used by typical video games and by programs such as GarageBand.)
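[Editor's note: below is a minimal sketch of this kind of image-to-sound conversion, assuming a Python/NumPy/SciPy toolchain rather than Mathematica; the column-sweep scheme, sample rate, and tone range are illustrative assumptions, not Dr. Bitetto's actual algorithm.]

```python
# Hypothetical sketch: sweep a grayscale image column by column and map
# each pixel's brightness to the loudness of a sine tone whose pitch is
# set by the pixel's row, then write the result as a 16-bit PCM WAV file.
import numpy as np
from PIL import Image
from scipy.io import wavfile

RATE = 44100                      # audio samples per second
COL_DURATION = 0.05               # seconds of sound per image column
F_LOW, F_HIGH = 200.0, 4000.0     # bottom row -> low tone, top row -> high tone

img = np.asarray(Image.open("photo.jpg").convert("L"), dtype=float) / 255.0
rows, cols = img.shape
freqs = np.linspace(F_HIGH, F_LOW, rows)           # one tone per image row
t = np.arange(int(RATE * COL_DURATION)) / RATE     # time axis for one column

chunks = []
for c in range(cols):
    weights = img[:, c][:, None]                   # brightness controls loudness
    tones = np.sin(2 * np.pi * freqs[:, None] * t)
    chunks.append((weights * tones).sum(axis=0))

audio = np.concatenate(chunks)
audio /= np.abs(audio).max() + 1e-9                # normalise to [-1, 1]
wavfile.write("photo_sonified.wav", RATE, (audio * 32767).astype(np.int16))
```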


H+: What do you have working right now?
Dr. B: I have recently gotten a chance to revisit an idea that a supervisor and I thought about some thirty-three years ago: constructing a helmet-based SAR vision system for the blind that either presents the area around the user as a series of varying sonic tones and whistles (one variation on the Laser Cane used by the blind) or produces a series of varying-intensity vibrations on the hand or wrist (another Laser Cane-like implementation). Both of these techniques are possible because the vast majority of the components needed to create a SAR vision system have become reasonably priced and readily available, as has the technology for constructing a DIY SAR vision system.
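[Editor's note: a minimal sketch of the two output modes described above, under assumed ranges and scaling constants; closer objects map to a higher pitch or a stronger vibration. The function names and numbers are illustrative, not part of any existing device.]

```python
# Hypothetical mapping from a sensor range reading to either an audible
# pitch or a vibration intensity; all constants are illustrative.

MAX_RANGE_M = 10.0                # readings beyond this count as "clear"
F_LOW, F_HIGH = 300.0, 3000.0     # tone range in Hz

def range_to_pitch(range_m: float) -> float:
    """Map a range in metres to a tone frequency in Hz (closer = higher)."""
    r = min(max(range_m, 0.0), MAX_RANGE_M)
    return F_HIGH - (F_HIGH - F_LOW) * (r / MAX_RANGE_M)

def range_to_vibration(range_m: float) -> float:
    """Map a range in metres to a vibration intensity in [0, 1] (closer = stronger)."""
    r = min(max(range_m, 0.0), MAX_RANGE_M)
    return 1.0 - r / MAX_RANGE_M

if __name__ == "__main__":
    for r in (0.5, 2.0, 5.0, 9.5):
        print(f"{r:4.1f} m -> {range_to_pitch(r):6.0f} Hz, "
              f"vibration {range_to_vibration(r):.2f}")
```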
Currently, I only have a very crude simulation of what a ground-based synthetic aperture radar image would look like (I still have to do more work to make the simulation more realistic). The simulation takes a visible-light grayscale image, performs a grayscale color inversion, and then applies grayscale color filtering to the negative of the image. Finally, the negative is embossed using an edge-detection method (the inverse Jacobian). I still need to perfect a series of equations and algorithms to make a far more realistic transformation from a visible-light image to a simulated synthetic aperture radar image. [Editor suggests: Synthetic Aperture Radar Imaging Simulated in MATLAB]
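[Editor's note: a rough sketch of the simulation steps just described (inversion, filtering of the negative, edge-based embossing), assuming a Python/NumPy/SciPy toolchain; a Sobel gradient magnitude stands in here for the Jacobian-based edge-detection step.]

```python
# Rough sketch of the crude SAR-look simulation described above:
# invert a visible-light grayscale image, filter the negative, then
# emboss it with an edge-detection pass and blend the two together.
import numpy as np
from PIL import Image
from scipy import ndimage

gray = np.asarray(Image.open("scene.jpg").convert("L"), dtype=float) / 255.0

negative = 1.0 - gray                                    # grayscale inversion
filtered = ndimage.gaussian_filter(negative, sigma=1.5)  # filter the negative

gx = ndimage.sobel(filtered, axis=1)                     # horizontal gradient
gy = ndimage.sobel(filtered, axis=0)                     # vertical gradient
edges = np.hypot(gx, gy)                                 # edge strength
edges /= edges.max() + 1e-9

# Blend the edges back onto the negative for an embossed, radar-like look.
sar_like = np.clip(0.6 * negative + 0.4 * edges, 0.0, 1.0)
Image.fromarray((sar_like * 255).astype(np.uint8)).save("scene_sar_like.png")
```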
This is my very first time dealing with radio-wave-based vision systems. I have read extensively about sonar- and radar-based systems, and from what I have read, radar-based systems have far fewer drawbacks than sonar-based systems. In fact, all of the information I have read about radar indicates that of all the currently implemented versions of radio-wave-based machine vision, synthetic aperture radar offers the most flexibility and is the only real analog to sonar.
Currently, I am still reading about SAR and ISAR technology and learning more about these technologies, in particular how to make them present a sound version of a panoramic view of the blind person's environment and how to operate them in a Doppler mode (a mode that allows the blind person to perceive the velocity of targets approaching them).
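[Editor's note: the Doppler mode mentioned here rests on the standard radar relation v = f_d · c / (2 f_tx); the sketch below uses an assumed X-band carrier frequency and example shift values for illustration only.]

```python
# Illustrative Doppler calculation: convert a measured Doppler frequency
# shift into the radial velocity of a target (positive = approaching).
C = 3.0e8        # speed of light, m/s
F_TX = 10.0e9    # assumed X-band carrier frequency, Hz

def doppler_to_velocity(f_doppler_hz: float) -> float:
    """Radial velocity in m/s from a Doppler shift, v = f_d * c / (2 * f_tx)."""
    return f_doppler_hz * C / (2.0 * F_TX)

if __name__ == "__main__":
    for fd in (67.0, 333.0, 667.0):
        print(f"Doppler shift {fd:6.1f} Hz -> {doppler_to_velocity(fd):5.2f} m/s")
```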


H+: What’s next? Construct an actual SAR? I used to work on airborne X-band radars that included SAR modes.
Dr. B: The next step that I can foresee is building and programming a prototype system that operates as a synthetic aperture radar and is also capable of being operated in a Doppler mode. MIT's Media Lab sells such a system as a kit; it is basically a collection of antennas, a rotational platform, motors, and a series of A/D and D/A converters that allow computer connections. They even offer some sample programs to process the images as a synthetic aperture radar system using a non-Doppler method of computational processing.
The beauty of using SAR in the S or X bands (radar bands typically used by the military, as opposed to the K, L, and M bands used for weather forecasting) is that most of the technology has already been declassified, and the various SAR computational algorithms are already extremely well studied and documented. As for the high computational demands of SAR, it is now quite common for portable computers to have multi-core processors with four to six cores and to be reasonably priced (just under $3K). Additionally, if the SAR system in question makes use of PCM-encoded pulses, then the blind user can also get information about the types of material the objects around them are made of (if I remember correctly from one of the declassified documents I read on synthetic aperture radar). Furthermore, SAR in the S or X bands works across a wide range of temperatures (unlike SONAR, which must be constantly recalibrated for changes in temperature and altitude) and can also see through smog, fog, heavy rain, and darkness. Therefore, SAR and ISAR are the only truly practical ways to give properly trained blind people "bat vision".
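[Editor's note: coded pulses of the kind mentioned above are normally processed by pulse compression, i.e. correlating the received signal against the known transmit code (a matched filter). The sketch below uses a Barker-13 phase code and synthetic noise purely for illustration; it is not a description of any particular SAR system.]

```python
# Minimal pulse-compression demo: a weak, noisy echo of a known coded
# pulse is recovered by correlating the received signal with the code.
import numpy as np

barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

rng = np.random.default_rng(0)
echo = np.zeros(200)
echo[80:80 + barker13.size] = 0.5 * barker13      # weak echo starting at sample 80
echo += 0.2 * rng.standard_normal(echo.size)      # receiver noise

# Matched filter: cross-correlate the received signal with the known code.
compressed = np.correlate(echo, barker13, mode="full")
delay = int(np.argmax(np.abs(compressed))) - (barker13.size - 1)
print("estimated echo start sample:", delay)      # expect ~80
```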


H+: Are you aware of efforts such as Northpaw? See http://sensebridge.net/projects/northpaw/
Dr. B: In terms of the Northpaw system, I have read the information provided at the link, and it gave the impression of only allowing the blind to orient themselves with respect to a northern reference point. This is something that I really don't see as a major improvement over $100 talking GPS navigation systems or cell-phone-based talking GPS navigation systems.


H+: What about laser canes?
Dr. B: With respect to the Velodyne LIDAR system being anything like a LASER/SONAR cane, I would rather stick with my guide dog… I've used both the LASER cane and the SONAR cane and have found them deeply deficient in their ability to compensate for a lack of vision.
Not only is the LASER cane unable to scan a 360-degree area (it usually scans only a 90-degree area in front of the blind user), it also typically uses a series of vibrating pins in the handle to let its user know about objects within 3 meters of the blind person. However, it does have a height-based sweep function that alerts the blind user to low-hanging obstacles in an auditory mode. The SONAR cane works in a similar fashion.


Want to build your own SAR imaging system? Watch this video:

2 Comments

  1. For those who want to learn more about Dr. Bitetto's work experience, see the following link: http://www.linkedin.com/pub/dr-marco-bitetto/39/54b/9aa