What Might "Deep Biom*" Look Like?

Michael Helms distinguished between inspirational and deep bio-inspired design (BID) in ZQ25 (https://issuu.com/eggermont/docs/zq_issue_25final01/38).  Inspirational BID can provide practitioners with new ways of looking at problems and has low barriers to entry, but projects often get stalled in the ideation phase.  Deep BID produces a higher yield of implementable solutions but requires significantly more investment in expertise and time, along with approaches tailored to the specific situation.

I recently stumbled on “Description and composition of bio-inspired design patterns: a complete overview” (https://link.springer.com/article/10.1007%2Fs11047-012-9324-y), which describes a set of self-organisation design patterns.  Each low-level pattern is extensively documented and then combined into intermediate- and high-level patterns (figure 4 in the paper), similar to the structure of Christopher Alexander’s pattern languages.  The authors included biological examples in their low-level patterns and named the high-level patterns using biological terms, but the patterns themselves seem to have been developed by studying computer science and robotics implementations.  This is not surprising – our detailed understanding of the mechanics behind self-organisation is in its infancy.  It may be that applying what we have learned from nature will help us define useful design patterns/principles that will accelerate broader practice.
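To make the idea of composition concrete, here is a toy sketch (names follow the paper loosely, and the grid size and rates are arbitrary) of how two low-level patterns, spreading and evaporation, might compose into a higher-level pheromone-like field:

```python
# Toy 1-D illustration of composing self-organisation patterns.
# All parameters are illustrative, not taken from the paper.

def evaporate(field, rate=0.1):
    """Low-level pattern: every value decays toward zero over time."""
    return [v * (1 - rate) for v in field]

def spread(field, share=0.5):
    """Low-level pattern: each cell leaks a share of its value to neighbours."""
    n = len(field)
    out = []
    for i in range(n):
        neigh = [field[j] for j in (i - 1, i + 1) if 0 <= j < n]
        out.append((1 - share) * field[i] + share * sum(neigh) / len(neigh))
    return out

# Higher-level pattern: a pheromone-like field is spreading + evaporation.
field = [0.0, 0.0, 1.0, 0.0, 0.0]   # a single deposit in the middle
for _ in range(3):
    field = evaporate(spread(field))
print(field)  # the deposit has diffused outward and decayed
```

The point is structural: the higher-level behaviour is nothing more than the ordered application of documented low-level patterns, which is what makes such a catalogue composable in the Alexander sense.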

We can also evaluate how well our application of these design patterns/principles matches the functionality we observe in nature, creating a feedback loop that continually improves the patterns.  Artificial Intelligence has made great strides in pattern matching, but the high cost of training these AI systems and their susceptibility to even minor image tampering suggest that AI systems are a long way from emulating how visual recognition works in nature.  “Machine vision that sees things more the way we do is easier for us to understand” (https://www.technologyreview.com/f/614870/ai-machine-vision-interpretable/) explores the challenges and also identifies pathways for improvement.  “Why even a moth’s brain is smarter than an AI” (https://www.technologyreview.com/s/610278/why-even-a-moths-brain-is-smarter-than-an-ai/) asks how moths are able to recognise new odors based on a few exposures while AI requires massive training sets, and describes research in building neural networks based on a deeper understanding of the moth’s olfactory learning system.
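The fragility under minor tampering can be shown at toy scale.  The sketch below (all weights and inputs are hypothetical) flips a simple linear classifier's decision with a small, targeted nudge of each feature against the sign of its weight, the same idea behind adversarial perturbations of image classifiers:

```python
# Toy illustration of adversarial fragility: a tiny, targeted change
# to the input flips a linear classifier's decision.
# Weights and inputs are arbitrary illustrative values.

def classify(w, b, x):
    """Return +1 or -1 from a linear decision function w.x + b."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

def perturb(w, x, eps):
    """Nudge each feature by eps against the sign of its weight."""
    sign = lambda v: 1 if v > 0 else (-1 if v < 0 else 0)
    return [xi - eps * sign(wi) for wi, xi in zip(w, x)]

w = [1.0, -1.0, 0.5]
x = [0.2, 0.1, 0.1]
print(classify(w, 0.0, x))                   # prints 1
print(classify(w, 0.0, perturb(w, x, 0.1)))  # prints -1: decision flipped
```

A biological visual system that generalises from a handful of examples does not fail this way, which is what makes the comparison in the articles above a useful benchmark for the feedback loop.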

These examples suggest that effective biom* requires more than a one-way transfer of knowledge from biology to technology.  There is a growing number of biom* examples involving people who brought novel expertise to both the biology and the technology, such as Annick Bay (https://issuu.com/eggermont/docs/zq_issue_25final01/90), Rolf Mueller (https://issuu.com/eggermont/docs/zq_issue_08_final/38), and John Dabiri (http://dabirilab.com/fieldlabs).

Peter Niewiarowski pointed out in ZQ16 (https://issuu.com/eggermont/docs/zqissue16/44) that “It is crucial to create a ‘space’ that encompasses the important knowledge fields and ideally enhances all of them.”  The process of doing biom*, analysing the results, and comparing our best attempts with what nature can accomplish may not only improve our ability to do biom* in a reliable and scalable manner, but also deepen our understanding of nature.
