A. Schematic of Heron’s two levels of operation. Left shows a Knowledge Graph made of Nodes with Names, named Parameters and named Input/Output points. Nodes are connected with links (Edges) that define the flow of information. Right shows a piece of textual code that fully defines both the Node’s GUI appearance and the processes run by the machine. B. Heron’s implementation of A, where the Knowledge Graph is generated in Heron’s Node Editor window, while the code that defines each Node and the code that each Node runs is written in a separate Integrated Development Environment (IDE; in this case, PyCharm).

Heron’s design principles. A. On the right, Heron’s Node structure, showing the two processes that define a running Node and how messages are passed between them. The design of the special message type defined as ‘data’ is shown on the left. B. Heron’s processes and message-passing diagram. Each rectangle represents a process. The top orange rectangle is Heron’s Editor process, while the other three green rectangles are the three Forwarders. The Proof of life forwarder deals with messages relating to whether a process has initialised correctly. The Parameters forwarder deals with messages that pass the different parameter values from the Editor process (the GUI) to the individual Nodes. Finally, the Data forwarder deals with passing the data messages between the com processes of the Nodes. The squares represent the worker (top) and com (bottom) processes of three Nodes (a Source, a Transform and a Sink, from left to right). Solid arrows represent message passing between processes using either the PUB/SUB or the PUSH/PULL socket types defined in the 0MQ protocol. Dashed arrows represent data passing within a single process.
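The Source → Transform → Sink data flow described above can be sketched with a standard-library analogue. Plain Python queues and threads stand in here for Heron’s 0MQ PUSH/PULL sockets and separate processes, and the message dictionary is an illustrative assumption, not Heron’s actual ‘data’ message format:

```python
import queue
import threading

def source(out_q, n):
    """Source Node: generates n messages, then a None sentinel."""
    for i in range(n):
        out_q.put({"index": i, "payload": i})  # hypothetical message shape
    out_q.put(None)

def transform(in_q, out_q):
    """Transform Node: doubles each payload and forwards it."""
    while True:
        msg = in_q.get()
        if msg is None:
            out_q.put(None)
            break
        msg["payload"] *= 2
        out_q.put(msg)

def sink(in_q, results):
    """Sink Node: collects the payloads of incoming messages."""
    while True:
        msg = in_q.get()
        if msg is None:
            break
        results.append(msg["payload"])

def run_pipeline(n=5):
    # Two queues stand in for the two PUSH/PULL links of the graph.
    q1, q2 = queue.Queue(), queue.Queue()
    results = []
    threads = [
        threading.Thread(target=source, args=(q1, n)),
        threading.Thread(target=transform, args=(q1, q2)),
        threading.Thread(target=sink, args=(q2, results)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

In Heron itself each Node runs as separate worker and com processes communicating over 0MQ sockets through the Data forwarder; the thread-and-queue version above only illustrates the pipeline topology.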

A. The Knowledge Graph of the Probabilistic Reversal Learning experiment, showing the four Nodes comprising the task, two of which (Trial Generator and Trial Controller) are connected in a loop (the output of one is fed into the input of the other). B. The schematic, or mental schema, of the experiment. Notice the direct correspondence (i and ii) between the Heron Nodes and the two main components of the experiment’s mental schema, as well as between the Nodes’ parameters and the schema components’ basic variables.

The Knowledge Graph of the Fetching the cotton bud experiment. The coloured circles at the bottom middle of each Node and the coloured star and square (where shown) are not part of the Heron GUI but are used in the Figure to indicate in which Machine/OS/Python configuration each Node is running its worker script (circles) and which visualisation image at the bottom corresponds to which Node (star and square). For the circles the colour code is: White = PC/Windows 11/Python 3.9, Red = PC/Windows 11/Python 3.8, Blue = Nvidia Jetson Nano/Ubuntu Linux/Python 3.9, Green = PC/Ubuntu Linux in WSL/Python 3.9. The two smaller windows below Heron’s GUI are visualisations created by the two Visualiser Nodes. The right visualisation (coming from the Visualiser##1 Node) is the output of the Detectron2 algorithm, showing how well it detects the whole cotton bud (detection box 0) and the cotton bud’s tips (detection boxes 3). The left visualisation (output of the Visualiser##0 Node) shows the angle of the cotton bud (in an arbitrary frame of reference where 0 degrees is almost horizontal on the screen and 90 degrees almost vertical). This angle is calculated in the TL Cottonbud Angle Node, which runs the deep learning algorithm and uses its inference results to calculate the angle of the cotton bud. As shown, the TL Cottonbud Angle Node runs on a Linux virtual machine (since Detectron2 cannot run on Windows machines).
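The angle computation performed by the TL Cottonbud Angle Node can be sketched as below. The detection-box format (x1, y1, x2, y2), the pairing of the two tip boxes, and the fold into a 0–180 degree orientation are illustrative assumptions about such a calculation, not the Node’s actual code:

```python
import math

def box_centre(box):
    """Centre point of a detection box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def cotton_bud_angle(tip_box_a, tip_box_b):
    """Angle in degrees of the line joining the centres of the two
    tip detection boxes, folded into [0, 180) so that it describes
    orientation on the screen rather than direction."""
    (ax, ay), (bx, by) = box_centre(tip_box_a), box_centre(tip_box_b)
    angle = math.degrees(math.atan2(by - ay, bx - ax))
    return angle % 180.0
```

With this convention a horizontally lying cotton bud gives an angle near 0 degrees and a vertical one near 90, matching the arbitrary frame of reference mentioned in the caption.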

Computer game for rats. A. The experimental design (see Supplementary Fig 4 for a detailed explanation). B. The Heron Knowledge Graph. C. The Unity project used to create the visuals and their reactions to the commands sent by the “TL Experiment Phase 2v2##0” logic Node. Using Unity made dealing with the “gamey” aspect of the experiment much simpler, while the logic and the detection of inputs were kept in Python. In this way we used each framework to its strengths while circumventing its limitations (i.e. we avoided writing visual game code in Python and interfacing with beam breakers and touch-sensitive panels in Unity).
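The Python-to-Unity command channel described above can be sketched with a minimal local UDP exchange. The port handling, the command string, and the use of UDP at all are illustrative assumptions; they are not the actual protocol used between the logic Node and the Unity project:

```python
import socket

def send_command(sock, addr, command):
    """Serialise a command string and send it to the Unity-side listener."""
    sock.sendto(command.encode("utf-8"), addr)

def demo():
    # Stand-in for Unity's listener: another local UDP socket.
    listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    listener.bind(("127.0.0.1", 0))  # OS-assigned free port
    addr = listener.getsockname()

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command(sender, addr, "show_stimulus:left")  # hypothetical command
    data, _ = listener.recvfrom(1024)

    sender.close()
    listener.close()
    return data.decode("utf-8")
```

A scheme like this keeps the experiment logic in Python while Unity only renders the visuals and reacts to simple, language-agnostic command strings.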