
Current Efforts
From Douglas Moreman

October 2017. Have submitted three new provisional (and nearly finished) patent applications. One is for a 3D streaming fish-finder which simplifies the more basic work of 2016 into what appears to be a competitive product.
Have introduced the word "toron" into the description of the hypothetical biological mechanism of the imaging sonar of toothed whales. Have rewritten the imaging code to explicitly use the concept of a "synchron."

22 March 2016. I have been mostly off-line for more than a year.
I made a discovery that "changed everything." Then I was ill for a few months. The discovery is in the foundations and can be, superficially, ignored. Its most important keyword is "synchron." Toavision can be based upon the detection of synchrons. From each synchron, one can infer the existence of a toa-event that created a traveling wave which impacted all the sensors in an array. The set of impacts generated the synchron, and the synchron, in turn, can be used to infer its generating toa-event.
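To make the inference step concrete, here is a minimal sketch in Python. Recovering a toa-event's position and emission time from a synchron's (sensor, time) pairs is, in this simplified form, ordinary least-squares multilateration; the sound speed, the seeding of the search, and all names here are my illustrative assumptions, not the Synchronics method itself:

```python
import numpy as np
from scipy.optimize import least_squares

C = 59055.0  # speed of sound in water, inches per second (about 1500 m/s)

def infer_toa_event(synchron, sensors, c=C):
    """Estimate the position p and emission time t0 of the toa-event that
    generated a synchron (a set of (sensor_index, arrival_time) pairs),
    by least squares on the range equations  c*(T_m - t0) = |sensor_m - p|.
    Plain multilateration, for illustration only, not Synchronics."""
    idx, times = zip(*synchron)
    s = sensors[list(idx)]
    ranges = c * np.asarray(times)      # measured ranges, offset by q = c*t0

    def residuals(x):                   # x = (px, py, pz, q), all in inches
        p, q = x[:3], x[3]
        return (ranges - q) - np.linalg.norm(s - p, axis=1)

    # a planar array cannot tell front from back, so seed the search a
    # few inches in front of the array (z > 0), just before the first echo
    x0 = np.append(s.mean(axis=0) + [0.0, 0.0, 5.0], ranges.min() - 5.0)
    sol = least_squares(residuals, x0)
    return sol.x[:3], sol.x[3] / c      # event position, emission time t0
```

With nine or more sensors and clean arrival times, the event is heavily over-determined, which is what makes imaging from a small array plausible at all.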

At the moment, I am adapting a simulation program, Sonic Eye, to:

analyze files created from echoes of real clicks of dolphins and
compute pictures of what a dolphin was clicking on with its sonar -
what the dolphin was "seeing" by means of echoes of its sonic clicks.

The new program concentrates on Feature-Based Passive (FBP) Sonar, with no active sonar remaining in the code. It simulates a fish that, in animation, can "swim" across the screen, and it simulates the echoes of a click, off the fish, arriving at echo-sensors. It computes a picture from the echoes of each click - one picture per click.

While adapting existing code to FBP sonar, I changed the grid of the array of sensors from hexagonal to square, to make the work easier for anyone who might build a real array. Perhaps, simply, an existing array of hydrophones can be adapted to feed data into this new program. Though the quality of the images might be better if the spacing of the sensors were closer to that in the jaw of a dolphin.
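As a rough illustration of such a setup, the sketch below builds a small square grid of sensors (3-by-3 at 0.5-inch spacing, matching the nine-sensor array described in the 2013 entry below) and simulates the arrival times of one click's echo off a single point of a model fish. The click-source and scatterer coordinates, the sound speed, and all names are hypothetical, not the program's actual code:

```python
import numpy as np

C = 59055.0  # speed of sound in water, inches per second (about 1500 m/s)

def square_array(n=3, spacing=0.5):
    """An n-by-n square grid of sensors, `spacing` inches apart, in the
    z = 0 plane, centered on the origin."""
    coords = (np.arange(n) - (n - 1) / 2) * spacing
    xx, yy = np.meshgrid(coords, coords)
    return np.column_stack([xx.ravel(), yy.ravel(), np.zeros(n * n)])

def echo_arrival_times(click_src, scatterer, sensors, c=C):
    """Times at which the echo of one click, off one point scatterer,
    reaches each sensor: (click-to-scatterer + scatterer-to-sensor) / c."""
    out = np.linalg.norm(scatterer - click_src)
    back = np.linalg.norm(sensors - scatterer, axis=1)
    return (out + back) / c

sensors = square_array()                              # nine sensors, 0.5 inch apart
times = echo_arrival_times(np.array([0.0, 1.0, 0.0]),   # hypothetical click source
                           np.array([2.0, 0.0, 20.0]),  # one point on the "fish"
                           sensors)
synchron = list(zip(range(len(sensors)), times))  # the (M, T) pairs of one synchron
```

One such set of nine arrival times is exactly one synchron in the sense defined above, and real hydrophone data could be fed through the same shape of interface.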

Also, I am programming-in a means to test some new methods of computation. Hopefully, this will either produce better pictures or produce pictures faster. The "book-keeping" involved is driving me crazy.

Note of 16 Jan 2015: I have been working on explanations of details of the new methods. The latest name for them is Synchronics. Imagine that a click emanates from the forehead of a dolphin and echoes of that click arrive at various times T at various sensors M. The set of ordered pairs (M,T) is a synchron. The new methods involve ways of teasing synchrons out of the welter of times-of-arrival of features of the click on the various sensors. Each "sensed synchron" can lead to one point in an image for that one click. The methods seem simple in my head, but are not lending themselves to easy expression in words. It is taking me "forever" to get the next patent application written.
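The Synchronics method itself is not described here, so the following is only a generic sketch of the grouping problem: test candidate points in space, and accept a candidate as a hit-point when every sensor has a detected arrival time close to the time predicted for that candidate. The matched (sensor, time) pairs then form one sensed synchron. The tolerance, sound speed, and function names are my assumptions:

```python
import numpy as np

def sensed_synchrons(arrivals, sensors, candidates, click_src, c, tol=1e-6):
    """Brute-force sketch of teasing synchrons out of a welter of
    arrival times.  `arrivals[m]` is the array of detected times at
    sensor m.  For each candidate point, predict the echo arrival time
    at every sensor; if every sensor detected a time within `tol`
    seconds of its prediction, the matched (sensor, time) pairs form
    one sensed synchron and the candidate becomes a hit-point."""
    hits = []
    for p in candidates:
        path = np.linalg.norm(p - click_src) + np.linalg.norm(sensors - p, axis=1)
        predicted = path / c
        synchron = []
        for m, t_pred in enumerate(predicted):
            diffs = np.abs(arrivals[m] - t_pred)
            k = diffs.argmin()
            if diffs[k] > tol:          # this sensor heard nothing consistent
                break
            synchron.append((m, arrivals[m][k]))
        else:                           # every sensor matched
            hits.append((p, synchron))
    return hits
```

Scanning candidates over a whole volume this way is expensive, which may be one reason the bookkeeping of faster methods becomes painful.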


12 July 2013
+++ In the past two weeks, and mostly in the past seven days,
I have experienced two new breakthroughs. One came after long mulling-over of how to understand, then imitate, the uses of the probable multitude of echotriggers in the jaw of a dolphin. The plucking of that "fruit" required a ladder. But the new fruit fell on its own onto my head - a probable solution to this problem: it had seemed to me that the imaging abilities of dolphins were not limited by the apparent limit set by the wave length of their clicks, about 0.6 inches.
Hubris suggests that I now have a grasp of all the major aspects of the imaging sonar of dolphins. Years earlier than I had expected.

01 June 2013
Broke through much clutter today.
Below is the first/best image made by an echoscope whose sensors are in a square grid; the image was computed by the Method of Ridges [2015: incorporated into "Synchronics."]

The colored dots (red, blue, beige) are what would appear on a sonar "telescope." The thick black lines represent the model of a "fish" from which echoes were simulated.
The image is far from that of video, but it is pretty good for sonar.
My prior experience prepared me to be stunned that such good information was generated by JUST NINE sensors. And each row (column) is just 0.5 inch from each of its neighbors. Think about that. It actually is magnificent.
If the new methods hold up in further testing, we will have confirmed that Feature-Based Passive methods can make images of fish (albeit crude so far) from an array that would fit into the chin of a dolphin.

All of the colored dots are "hit-points" generated by the Ridge Method from simulated echoes from the simulated fish, from just one dolphin-like click. (Dolphins often click several times per second.) The black stripes represent the fish, which is modeled by a few thin "ribs" -- to reduce time-of-computing, so that I do not have to wait long between experiments. A side view of the actual, simulated fish is down below.

Note of 23 Aug 2013: The Method of Ridges led to the Method of Cells, which was enhanced by the Method of Bridges, and all were thrown out for the Method of Sheets [2015: all wrapped up into Synchronics]. The images are better now and I am working to improve them. Speed of computation has increased.