At 08:17 PM 6/23/99 , Thomas McGahee wrote:
>Matt,
>Your imaging device has only a single point of view, and that will limit
>greatly the "detail" you can extract from the echo information.

That's part of the trade-offs that I made in the design.  The more
sensors I add, the more support circuitry and physical size I need, and
I would like to keep it somewhat small.

>Imagine TWO of your current units mounted, say a few feet apart and operated
>in sing-song fashion. First one sends/receives, then the other. Since each
>can independently sweep an overlapping zone, you can glean more info
>about this overlapping region.

This method is still limited by the focus of the transducer, which is my
biggest obstacle.

>Mode Two: Instead of sweeping circularly as you currently do, mount a
>single unit on a linear track so it can move back and forth a few feet.
>If you aim the unit straight ahead and move from one side of the track
>to the other you will build up a sliding image that has many forward
>looking points of view.
>Well, at least it's linear instead of curved...

This is a lot like what I used to do (in my day job).  For about 6 years
I worked on a radar that rode along a track and was able to generate a
synthetic aperture radar (SAR) image.  This is actually one of the
things that I would like to work on, in addition to a phased-array sonar
in which the beam is steered electronically.  The ultrasonic transducers
actually have an advantage in this situation, since they have a very
wide beam pattern, but SAR has a few limitations in this application.
Because the range to the target must be on the order of the size of the
array (since gain is poor), the transformation from the "time" output of
the sonar to the image output is pretty nonlinear (if the range is >>
aperture size, an FFT is a really easy way to make an image), and you
have to do a whole lot of warping of the image.  Another big difficulty
is that when you are looking at an object over a wide angle, you run
into non-linearities because the return off of most real objects is
highly specular (it varies greatly with incidence angle), which makes it
harder to form an image.  You also run into the problem that moving
targets appear in the wrong place in the image because of Doppler
shifts.  Finally, you need to sample at a higher rate with a lot of
precision, because you end up needing to do a backprojection type of
image formation, where precise phase data from the incoming signal is
vital to the quality of the image.
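To make the backprojection point concrete, here's a toy simulation (all
the numbers in it, the carrier, sample rate, and geometry, are invented
for illustration, not from my actual setup):

```python
import numpy as np

# Toy near-field backprojection: one point target, a linear track of
# transmit/receive positions, and an idealized complex echo recorded
# at each position.  All parameters are assumptions for illustration.
c = 343.0              # speed of sound in air, m/s
f0 = 40e3              # assumed 40 kHz ultrasonic carrier
fs = 500e3             # sample rate, Hz
n = 8192               # samples per echo record

track = np.linspace(-0.5, 0.5, 64)   # sensor x positions on track, m
target = (0.1, 1.0)                  # point target (x, y), m

t = np.arange(n) / fs
echoes = np.zeros((track.size, n), dtype=complex)
for i, xpos in enumerate(track):
    r = np.hypot(target[0] - xpos, target[1])
    delay = 2.0 * r / c              # round-trip delay, s
    # idealized pulse: carrier phase inside a short envelope
    echoes[i] = (np.exp(2j * np.pi * f0 * (t - delay))
                 * (np.abs(t - delay) < 20e-6))

# Backproject: for every pixel, look up each position's sample at the
# round-trip delay and remove the expected carrier phase before
# summing.  Coherent (phase-aligned) summation is what focuses the
# image -- this is why precise phase data matters so much.
xs = np.linspace(-0.3, 0.3, 61)
ys = np.linspace(0.5, 1.5, 51)
image = np.zeros((ys.size, xs.size))
idx = np.arange(track.size)
for iy, y in enumerate(ys):
    for ix, xpix in enumerate(xs):
        d = 2.0 * np.hypot(xpix - track, y) / c
        k = np.round(d * fs).astype(int)
        vals = echoes[idx, k] * np.exp(-2j * np.pi * f0 * (t[k] - d))
        image[iy, ix] = np.abs(vals.sum())

iy, ix = np.unravel_index(np.argmax(image), image.shape)
print(xs[ix], ys[iy])   # peak lands at the target, near (0.1, 1.0)
```

Note that the delay lookup has to hit the right sample and the phase
correction has to be right to within a fraction of a carrier cycle, or
the sum decorrelates and the image smears, which is where the sampling
rate and precision requirements come from.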

I still want to create a sonar that uses an array, say about 3 feet
wide, with an electronically steered beam.  That should give great
images, but it comes with a corresponding leap in complexity and
expense.
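A quick back-of-the-envelope on the 3-foot array (the element spacing,
carrier, and element count here are assumptions, not a worked design)
shows both why the images should be great and where the expense comes
from:

```python
import numpy as np

# Per-element time delays for steering a linear ultrasonic array.
# Parameters are assumptions: a 40 kHz carrier and half-wavelength
# element spacing (the usual choice to avoid grating lobes).
c = 343.0                    # speed of sound in air, m/s
f0 = 40e3                    # assumed carrier, Hz
wavelength = c / f0          # ~8.6 mm
spacing = wavelength / 2
width = 0.9144               # 3 feet, in meters
n_elems = int(width / spacing) + 1

theta = np.radians(30)       # desired steering angle off boresight
elem_x = np.arange(n_elems) * spacing
delays = elem_x * np.sin(theta) / c   # delay per element, s

print(n_elems)                        # element count: ~214
print(delays[-1] * 1e6)               # delay span in microseconds
print(np.degrees(wavelength / width)) # beamwidth estimate, degrees
```

The beamwidth estimate (wavelength over aperture) comes out around half
a degree, which is what makes the images attractive, but the element
count is the complexity-and-expense part.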


>Extend the concept one more stage: Position your sonar transducers to
>the extreme left and aimed 45 degrees to the right. As you step the
>sensors down the track from left to right, keep adjusting the angle of
>view of the transducers so that they are always looking directly at a
>POINT a few feet ahead. By the time the sensors are at the extreme
>right of the track they will be pointed *left* towards this point. Now
>you have a bunch of data that describes a curved view looking *in*
>instead of *out*. If you do the run again but looking at a different
>focal point you will get some more "new" information instead of just
>the "same old" information over and over. Of course you now have to
>manipulate this data to be able to display an image, but at least now
>your image is richer.
>
>A phased array of sensors arranged in a curve can give you much the
>same info. In this case the sensors are stationary and activated one
>after the other to generate a "sweep". Tradeoff is more sensors, but
>greater speed of acquisition.
>
>Just a few random ramblings from
>Fr. Tom McGahee

Thanks.  I guess what I really need is a way to defeat the laws of
physics (wavefront propagation and all).

Matt Bennett