Fabrication of the 3D BAC-eye
The design of the 3D BAC-eye follows the anatomical structure of an apposition compound eye (Fig. 1a). Each microlens on the BAC-eye has the same function as the corneal facet lens of a natural eye. The cylindrical post and the silicone-elastomer waveguide function as a crystalline cone and a rhabdom, respectively (Fig. 1b). The internal structure of the artificial eye mimics the function of pigment cells to reduce optical crosstalk. The number of ommatidia in the BAC-eye (522) is comparable to those of bark beetles (Dendroctonus rufipennis, average count: 272; Dendroctonus valens, average count: 372), ants (Temnothorax albipennis, average count: 300 (males) and 171 (queens); Brachyponera chinensis, average count: 168 (workers) (Fig. 1c)), and fruit flies (Drosophila melanogaster, average count: 730)47,48,49. Figure 1d shows a top-view SEM image of the BAC-eye. It has a radius of 2.5 mm, and its microlenses are hexagonally and omnidirectionally distributed across the hemispherical dome. The most peripheral ommatidia are oriented at ±85° with respect to the vertical axis, extending the viewing angle of the BAC-eye to 170°.


a Anatomical structure of an arthropod compound eye. b Labelled cross section of a BAC-eye. c SEM image of a compound eye of the Asian needle ant, Brachyponera chinensis. d SEM image of a BAC-eye. e Illustration of the main steps of the fabrication procedure. The BAC-eye is produced in a hemispherical substrate by casting it in a prepared mould. f Image of the 3D-printed mould. g The 3D-printed substrate and a quarter sectional slice of the substrate. h Image of a BAC-eye after release from the mould. i A view showing the flat bottom of the BAC-eye.
The fabrication process and its components are illustrated in Fig. 1e–i (additional details are provided in Supplementary Figs. 1–4). First, a mould with an open hemispherical pit is 3D-printed using a projection micro-stereolithography 3D printer50,51. The surface of the hemispherical mould is patterned with 522 cylindrical microholes, each 180 μm in diameter, arranged omnidirectionally along the surface of the hemisphere (Fig. 1f and Supplementary Fig. 2). Forming the lens mould surface within these cylindrical holes, however, requires an additional step: because of the small size of the microholes and the limited resolution of current 3D-printing technology, the lens curvature cannot be printed into the mould directly. Therefore, a microfluidic-assisted moulding technique, which leverages surface tension, was used to form the proper concave shape within each microcavity.
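For orientation, the short sketch below maps angular microhole positions (polar angle α, azimuthal angle β) to Cartesian coordinates on the hemispherical mould surface. Only the hole count, the 2.5 mm dome radius, and the ±85° extent are taken from the text; the Fibonacci-lattice spacing is a placeholder of ours, not the paper's hexagonal layout (Supplementary Fig. 2).

```python
import numpy as np

def ommatidium_positions(n=522, dome_radius_mm=2.5, alpha_max_deg=85.0):
    """Approximate centre positions of the microholes on the hemispherical
    mould from their angular orientations (polar angle alpha, azimuth beta).
    A Fibonacci lattice is used only as a placeholder for the paper's
    hexagonal layout (Supplementary Fig. 2); n, the dome radius, and the
    maximum polar angle are the values reported in the text."""
    i = np.arange(n)
    # Spread the points uniformly in cos(alpha) so the density is roughly
    # even over the cap, with the outermost row near +/-85 deg (170 deg FOV).
    alpha = np.arccos(1.0 - (1.0 - np.cos(np.radians(alpha_max_deg))) * (i + 0.5) / n)
    beta = np.pi * (1.0 + 5.0 ** 0.5) * i        # golden-angle azimuthal steps
    x = dome_radius_mm * np.sin(alpha) * np.cos(beta)
    y = dome_radius_mm * np.sin(alpha) * np.sin(beta)
    z = dome_radius_mm * np.cos(alpha)
    return np.column_stack([x, y, z]), np.degrees(alpha), np.degrees(beta) % 360.0

xyz, alpha_deg, beta_deg = ommatidium_positions()
print(xyz.shape, round(alpha_deg.max(), 1))   # (522, 3), maximum polar angle close to 85 deg
```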
The procedure of the microfluidic-assisted moulding is illustrated in Fig. 2a, Supplementary Fig. 1, and Supplementary Note 1. To form the microlens mould, the hemispherical pit with cylindrical microholes is first filled with acrylate resin. The mould is then spun around its central axis at a spin rate of Ω rpm for 4 min. As the mould spins, a portion of the acrylate resin is ejected from the microholes by the centrifugal force generated by the rotation. The amount of resin that remains within each hole is a function of the spin parameters and the location of the hole within the mould. Figure 2b and Supplementary Fig. 5 present results from numerical simulations of the surface profile of the acrylate resin in microholes at different orientations (polar angle, α, and azimuthal angle, β, as defined in Supplementary Fig. 3) before and after spinning the mould. During spinning, the surface of the liquid acrylate resin in the on-axis microhole (α = 0°) assumes a symmetric parabolic shape, whereas the resin surface in the off-axis microholes (α ≠ 0°) inclines increasingly towards the outer side of the microhole as α increases. The amount of acrylate resin remaining in a microhole decreases as the spinning speed increases (Supplementary Fig. 5). Moreover, in the off-axis microholes (α ≠ 0°) close to the centre, most of the acrylate resin ascends up the side of the microhole and spills out (Supplementary Fig. 6). In contrast, the tilted microholes near the edge of the hemispherical pit retain more acrylate resin.
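As purely kinematic context for these trends (the numbers below are illustrative and not from the paper), the centrifugal acceleration felt by the resin in a microhole grows with the hole's distance from the spin axis, r = R sin α, and with the square of the spin rate; both its magnitude and its direction relative to the hole axis change with α, which is qualitatively consistent with the location-dependent retention described above. The actual surface profiles come from the fluid simulations in Fig. 2b and Supplementary Fig. 5.

```python
import numpy as np

def centrifugal_acceleration(spin_rpm, alpha_deg, dome_radius_mm=2.5):
    """Centrifugal acceleration (m/s^2) experienced by resin in a microhole at
    polar angle alpha when the mould spins at spin_rpm about its central axis.
    The hole sits a distance r = R*sin(alpha) from the spin axis."""
    omega = 2.0 * np.pi * spin_rpm / 60.0                        # rad/s
    r = dome_radius_mm * 1e-3 * np.sin(np.radians(alpha_deg))    # m
    return omega ** 2 * r

for rpm in (1500, 4500):   # the two spin rates used for the test moulds
    print(rpm, [round(centrifugal_acceleration(rpm, a), 1) for a in (0, 36, 85)])
# The acceleration vanishes on the axis, grows toward the rim, and scales with
# the square of the spin rate; how much resin a hole retains also depends on
# the direction of this force relative to the hole axis, which changes with alpha.
```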


a Illustration showing the fabrication of the mould. b The surface profiles of the acrylate resin in microholes at different orientations under dynamic (while spinning) and static (after spinning) equilibrium. c A comparison of the surface profiles of the acrylate resin (top) on a flat polymer substrate, (middle) in the microhole, and (bottom) after the microlens is demoulded from the microhole. d Contact angle measurements for the acrylate resin on the photosensitive polymer. e Microscope images of the artificial ommatidia. The microlenses at different orientations and produced at different spin rates have a uniform curvature. f Images of a single row of ommatidia along the curved surface of the BAC-eye. Different post heights result from the location of the posts and the balance of forces during fabrication. g Experimentally measured (markers) and simulated (solid curves) post-height distributions across the surface of the hemisphere for different rotational rates.
When the spinning stops, the surface tension dominates and deforms the surface of the acrylate resin into a concave shape within all the microholes. More specifically, the radius of curvature of the concave surface is set by the contact angle, θe, between the three phases under thermodynamic equilibrium, i.e., R = d/(2 cos θe), where d is the diameter of the microhole (Fig. 2c). The equilibrium contact angle, θe, is 13.2° for the system used in these experiments, as measured in Fig. 2d. Once a uniform curvature is achieved within each microhole, the liquid acrylate resin in each chamber is UV-cured for 15 min. The convex surface of each ommatidium is then obtained as the complementary mould of the acrylate resin in the microholes. To analyse the performance of this replication process, two moulds, each consisting of a single row of microholes on the bottom of the hemispherical pit, were prepared by spinning the acrylate resin at 1500 and 4500 rpm, respectively. Figure 2e shows side-view optical images of the replicated microlenses at different orientations and spinning speeds. The profiles of the microlenses are nearly identical, demonstrating that their shape is independent of orientation and rotational speed, consistent with the assumption that surface tension dominates the formation of the concave lens surfaces. The radius of curvature of each microlens is 91.9 ± 0.8 μm (Supplementary Fig. 7), in good agreement with the theoretical prediction, R = 92 μm (Supplementary Equation 9 in Supplementary Note 2). As expected, the height of the cylindrical post (hPost) depends on both the rotational speed (Ω) and the orientation of the microholes (α) in the mould, as shown in Fig. 2f. This is because a higher spin speed removes more of the acrylate resin, and microholes closer to the centre retain less resin; both effects leave a deeper unfilled microhole and therefore a taller complementary post. Figure 2g shows the height of the post as a function of α at different rotational speeds Ω. The experimental data (markers) agree well with the calculations (solid curves). Even though the height of each post differs between ommatidia in the BAC-eye, the curvature of the microlens on each post is the same. This means that all the ommatidia have an identical relative aperture.
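As a quick numerical check of the capillary prediction, assuming (as the matching values suggest) that d in the expression above is the 180 μm microhole diameter:

```python
import numpy as np

# Capillary prediction for the concave resin surface, R = d / (2 cos(theta_e)),
# assuming d is the 180-um microhole diameter and using the measured contact
# angle theta_e = 13.2 deg.
d_um = 180.0
theta_e_deg = 13.2
R_um = d_um / (2.0 * np.cos(np.radians(theta_e_deg)))
print(f"Predicted radius of curvature: {R_um:.1f} um")
# ~92.4 um, consistent with R = 92 um from Supplementary Equation 9 and the
# measured 91.9 +/- 0.8 um for the replicated microlenses.
```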
After the concave lens mould surfaces are fabricated, a hemisphere that is complementary to the patterned mould is 3D-printed using a UV-curable diacrylate polymer (refractive index nPolymer = 1.46) mixed with Sudan Black 3 solvent dye (Fig. 1g). The hemisphere contains 522 hollow pipelines, or tapered channels, which connect the hemispherical surface to the flat base. Additional details about the design and the 3D model of this complementary substrate can be found in Supplementary Note 3 and Supplementary Figs. 3 and 4. The hemisphere is then inserted into the mould, and the hollow pipelines in the substrate are aligned with the cylindrical microholes in the mould. The empty pipelines and the concave microholes are filled with silicone by immersing the assembly in liquid room-temperature-vulcanizing (RTV) silicone. After curing for 4 h, the RTV silicone solidifies into an elastomer. Separating the hemisphere from the mould yields a complete BAC-eye, as shown in Fig. 1h. Each ommatidium consists of a microlens with a radius of 90 μm capped on a cylindrical polymer post. These ommatidia are optically connected to the bottom of the BAC-eye through the silicone-elastomer waveguide (refractive index nSilicone = 1.50) formed in the hollow pipeline. The diameters of the silicone-elastomer waveguides gradually narrow from the ommatidia (dT = 157 μm) to the bottom of the BAC-eye (dB = 100 μm); this taper increases the separation between the individual output signals. The outputs of the waveguides are hexagonally arranged at the flat base of the BAC-eye (Fig. 1i). Since the bottom of the BAC-eye is physically flat and the 3D array of surface microlenses has been mapped to a regular hexagonal 2D array via these waveguides, the system can be directly matched to any commercial planar image sensor.
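Given the refractive indices quoted above, a rough step-index estimate of the waveguide numerical aperture can be made; treating each tapered, curved channel as an ideal straight fibre is our simplification, not the authors' analysis.

```python
import numpy as np

# Step-index estimate of the waveguide numerical aperture, treating the RTV
# silicone as the core and the dyed photosensitive polymer as the cladding,
# and idealizing the tapered, curved channel as a straight fibre.
n_core = 1.50    # RTV silicone (nSilicone)
n_clad = 1.46    # dyed diacrylate polymer (nPolymer)

NA = np.sqrt(n_core ** 2 - n_clad ** 2)
half_angle_deg = np.degrees(np.arcsin(NA))   # acceptance half-angle in air
print(f"NA ~ {NA:.2f}, acceptance half-angle ~ {half_angle_deg:.0f} deg")
# NA ~ 0.34 and a full acceptance cone of roughly 40 deg, the same order as
# the ~44 deg acceptance angle measured in the next section.
```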
Optical characterization of the BAC-eye
To ensure that the BAC-eye maintains an optical fidelity comparable to that of a natural compound eye, we optimized the performance of the waveguides and the hemispherical substrate of the device. The hemispherical element, which serves as the supporting body of the BAC-eye, is made of a photosensitive polymer dyed with Sudan Black 3 solvent dye. The optical waveguides that connect the cylindrical posts of the artificial ommatidia to the bottom surface of the BAC-eye are patterned within this photosensitive polymer. The dye absorbs stray light that escapes from the waveguides and thereby suppresses optical crosstalk between adjacent waveguides. Details about the optical density of the dyed photosensitive polymer are discussed in Supplementary Note 4. At a dye concentration of 1500 μg/mL, a 9.8-μm-thick layer of the photosensitive polymer has an optical density of 3 across the entire visible spectrum (Supplementary Figs. 8 and 9). The RTV silicone used for the ommatidia and waveguides is transparent over the range of 400–1100 nm (Supplementary Fig. 10).
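For reference, the reported optical density translates directly into a transmitted fraction; the linear Beer-Lambert scaling with path length in the second step is an assumption of this sketch.

```python
# Transmission implied by the reported optical density of the dyed polymer:
# an OD of 3 (9.8-um film, 1500 ug/mL dye) passes only 0.1% of the stray light.
def transmittance(od):
    return 10.0 ** (-od)

print(f"Transmitted fraction at OD 3: {transmittance(3.0):.0e}")   # 1e-03

# Assuming Beer-Lambert scaling of OD with path length (our assumption for
# this sketch), stray light crossing the 100-um polymer wall between adjacent
# waveguides would be attenuated far more strongly than this single-film value.
def od_for_thickness(t_um, od_ref=3.0, t_ref_um=9.8):
    return od_ref * t_um / t_ref_um

print(f"OD across a 100-um wall: ~{od_for_thickness(100.0):.0f}")   # ~31
```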
To further examine the transmission properties of, and the optical crosstalk between, the waveguides of the BAC-eye, simulation models of three ommatidia were established based on the actual structure and physical properties of the ommatidia, as shown in Fig. 3a and Supplementary Figs. 11 and 12. When collimated light was incident on the microlens of a specific ommatidium, light was output only from the end of the corresponding waveguide. In addition, three curved silicone waveguides were fabricated within photosensitive polymer mixed with 1500 μg/mL of solvent dye, as shown in Fig. 3b. The diameter of each silicone waveguide and the separation between adjacent optical pathways are both 100 μm, consistent with the dimensions of the BAC-eye. The bend radius and bend angle of the waveguides are 600 μm and 90°, respectively. A multimode fibre delivering light from a 450 nm source was connected to the middle waveguide (Channel 2). Figure 3c shows an image of the outputs of the three waveguides when only Channel 2 is illuminated. No light could be visibly detected from Channel 1 or Channel 3, consistent with the light distribution measured at the proximal ends of the waveguides (Fig. 3d). The extinction ratios between Channel 2 and Channel 1 and between Channel 2 and Channel 3 are 16.1 and 15.2 dB, respectively. These results are consistent with the optical density measurements in Supplementary Fig. 8 and confirm that the photosensitive polymer mixed with the solvent dye effectively eliminates optical crosstalk between the waveguides.
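The quoted extinction ratios can be translated into relative leaked powers as follows; only the dB values come from the measurement, and the absolute powers are illustrative.

```python
import numpy as np

def extinction_ratio_db(p_signal, p_leak):
    """Extinction ratio (dB) between the illuminated channel and a dark one."""
    return 10.0 * np.log10(p_signal / p_leak)

# Leaked power implied by the measured ratios, with the Channel 2 output
# normalized to 1 (absolute powers are illustrative, not measured values).
p2 = 1.0
p1 = p2 / 10.0 ** (16.1 / 10.0)   # Channel 1 given a 16.1 dB extinction ratio
p3 = p2 / 10.0 ** (15.2 / 10.0)   # Channel 3 given a 15.2 dB extinction ratio
print(f"Relative leakage: Ch1 ~ {p1:.3f}, Ch3 ~ {p3:.3f}")   # ~0.025, ~0.030
print(f"Check: {extinction_ratio_db(p2, p1):.1f} dB, "
      f"{extinction_ratio_db(p2, p3):.1f} dB")
```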


a Simulation of optical crosstalk among three ommatidia with orientations of (α = 64.8°, β = 0°), (α = 72°, β = 0°), and (α = 79.2°, β = 0°), and intensity distributions at the proximal ends of the waveguides when light is incident on the middle ommatidium. b A three-channel model used to measure the crosstalk among the curved silicone waveguides. c Image captured at the output panel showing the light intensity at the proximal ends of the three silicone waveguides. d The measured light distribution at the output panel. e–h Ray-tracing simulations of the ommatidia with orientations of (e, f) (α = 0°, β = 0°) and (g, h) (α = 36°, β = 0°), and light intensity distributions at the proximal ends of the waveguides when light is incident from different angles. i, j The angular sensitivity functions of the ommatidia with orientations of (α = 0°, β = 0°) and (α = 36°, β = 0°), respectively. Red dots: normalized intensities obtained from experimental measurements. Grey surface: Gaussian fit to the experimental data.
We also analysed the coupling and propagation of light within each ommatidium of the BAC-eye using a ray-tracing method. In these simulations, collimated light incident on the microlens of an ommatidium at an incidence angle (polar angle α′ and azimuthal angle β′, as defined in Supplementary Fig. 13) is coupled into the waveguide. We performed the simulations for a straight waveguide and for two curved waveguides with orientations of (α = 36°, β = 0°) and (α = 79.2°, β = 0°), respectively, to mimic the curvature of the optical pathways within the BAC-eye (Fig. 3e–h and Supplementary Figs. 14–16). Since the refractive index of the substrate is lower than that of the waveguide, total internal reflection ensures that the light propagates inside the waveguide. Owing to the oblique incidence and the non-axisymmetric multiple reflections inside the curved waveguide, the ray distribution at the proximal end of the waveguide is shifted away from the centre. Nevertheless, the simulations confirm that the light is well confined within the ommatidia and is efficiently transmitted to the base of the eye. In addition, experimental measurements show that the total optical loss, including the coupling losses into and out of the ommatidium and the transmission loss, is 5.37 dB (details about the measurement of the optical loss can be found in Supplementary Note 5 and Supplementary Fig. 17). The loss is attributed to the bending and narrowing of the waveguide.
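Converted to a transmitted fraction, the measured loss implies that roughly 29% of the collected light reaches the base of the eye:

```python
# Fraction of collected light delivered to the base of the eye implied by the
# measured 5.37 dB total loss (coupling into/out of the ommatidium plus
# transmission through the bent, tapered waveguide).
loss_db = 5.37
transmitted_fraction = 10.0 ** (-loss_db / 10.0)
print(f"~{100.0 * transmitted_fraction:.0f}% of the light reaches the "
      "proximal end of the waveguide")   # ~29%
```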
Because of the 3D geometry of the system, light enters each ommatidium at a different angle; therefore, the angular sensitivity of the ommatidia was also investigated. In the simulations, the light intensity distribution at the proximal end of the waveguide was analysed and the transmittance, i.e., the ratio between the integrated output intensity at the proximal end and the intensity incident on the microlens, was calculated (Supplementary Figs. 18 and 19). Slight bending loss was observed in the highly curved waveguides. Notably, the substrate absorbs the light that leaks from the waveguides, so optical crosstalk between adjacent optical pathways is efficiently avoided. Furthermore, the angular sensitivity functions of three ommatidia with orientations of (α = 0°, β = 0°), (α = 36°, β = 0°), and (α = 79.2°, β = 0°) were experimentally measured. Supplementary Fig. 20 schematically shows the experimental setup, in which a collimated light beam illuminates the surface of the BAC-eye and the angular sensitivity function of each ommatidium is obtained by measuring the transmitted light intensity as a function of the incident angle of the beam (details about the experimental measurement and the simulation of angular sensitivity can be found in Supplementary Note 6). The collimated beam can be rotated around the BAC-eye to any angle (α′ or β′), as defined in Supplementary Fig. 20. Figure 3i, j and Supplementary Fig. 21 show the angular sensitivity functions of the three ommatidia. The light intensity is normalized to the maximum value measured at the central ommatidium (α = 0°). The red points are experimental data, and the surface is a Gaussian fit to these data. The ommatidia with orientations of (α = 0°, β = 0°), (α = 36°, β = 0°), and (α = 79.2°, β = 0°) have their highest intensity at incident angles of (α′ = 0°, β′ = 0°), (α′ = 12°, β′ = 180°), and (α′ = 30°, β′ = 180°), respectively. The acceptance angle of each ommatidium, defined as the full width at half maximum of the angular sensitivity function, is about 44°. This wide acceptance angle is attributed to the large diameter of the waveguide, which supports a large number of propagating modes52. These experiments indicate that the light collected by each ommatidium is efficiently transmitted to the bottom surface of the BAC-eye and can be directly detected by a planar image sensor regardless of the angle of incidence on the artificial eye.
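As a minimal illustration of how the acceptance angle is extracted, the sketch below fits a Gaussian to a synthetic 1D angular sensitivity cut and reports its full width at half maximum; the paper fits a 2D Gaussian surface over both incidence angles, and the data here are not the measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(angle_deg, amp, centre_deg, sigma_deg):
    return amp * np.exp(-0.5 * ((angle_deg - centre_deg) / sigma_deg) ** 2)

# Synthetic 1D cut through an angular sensitivity function (illustrative data,
# not the measurements); the paper fits a 2D surface over (alpha', beta').
alpha_prime = np.linspace(-60.0, 60.0, 25)
rng = np.random.default_rng(0)
measured = gaussian(alpha_prime, 1.0, 0.0, 44.0 / 2.355) \
           + 0.02 * rng.standard_normal(alpha_prime.size)

popt, _ = curve_fit(gaussian, alpha_prime, measured, p0=[1.0, 0.0, 20.0])
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(popt[2])   # FWHM = 2.355 * sigma
print(f"Fitted acceptance angle (FWHM): {fwhm:.1f} deg")  # ~44 deg by construction
```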
Panoramic imaging using the BAC-eye
In contrast to conventional macro imaging lenses, the BAC-eye is capable of forming wide-angle panoramic images. Figure 4a shows the working principle for capturing panoramic images by coupling a BAC-eye to a planar complementary metal–oxide–semiconductor (CMOS) camera. Light emitted or reflected from an object, such as the red or blue regular tetrahedron in Fig. 4a, is captured by each ommatidium and guided to the bottom of the BAC-eye, where its image is recorded by a colour camera. The sub-image on the camera corresponding to each ommatidium is then homogenized by taking its average intensity; this averaging is needed because each ommatidium projects its light across ~80 × 80 pixels of the planar image sensor. Finally, a panoramic image of the object is generated on a hemisphere by digitally stitching the sub-images together while accounting for the orientation of each ommatidium on the outer surface of the BAC-eye (details in ‘Methods’ and Supplementary Fig. 22). The resolution of the BAC-eye is determined by the total number of ommatidia.
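A minimal sketch of this reconstruction step is shown below; it assumes a prior calibration provides the sensor-pixel centre and the (α, β) orientation of each ommatidium's waveguide output, and the function and variable names are ours.

```python
import numpy as np

def reconstruct_hemispherical_image(sensor_frame, centres_px, alphas_deg,
                                    betas_deg, patch_px=80):
    """Reduce each ommatidium's ~80 x 80 pixel sub-image to its mean intensity
    and assign it to that ommatidium's viewing direction on the hemisphere.
    `centres_px` (pixel centre of each waveguide output) and the (alpha, beta)
    orientations are assumed to come from a prior calibration."""
    half = patch_px // 2
    directions, intensities = [], []
    for (cx, cy), a, b in zip(centres_px,
                              np.radians(alphas_deg), np.radians(betas_deg)):
        cx, cy = int(round(cx)), int(round(cy))
        patch = sensor_frame[cy - half:cy + half, cx - half:cx + half]
        intensities.append(patch.mean())             # homogenized sub-image value
        directions.append((np.sin(a) * np.cos(b),    # unit viewing direction
                           np.sin(a) * np.sin(b),
                           np.cos(a)))
    # Plotting `intensities` at `directions` on a hemisphere gives the panorama.
    return np.asarray(directions), np.asarray(intensities)
```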


a Workflow for image acquisition and processing by the BAC-eye. The colour scale represents the intensity level of the detected signals. b Schematic diagram of the experimental setup of the imaging system. c The square mask used for the image detection experiments. d The illuminated square pattern as detected by the BAC-eye. The image of the illuminated pattern on the artificial ommatidia is captured by a single-lens reflex (SLR) camera equipped with a macro lens. e The square pattern image captured from the bottom of the BAC-eye. f The digitally reconstructed hemispherical image. g Illustration of the hemispherical imaging of a red cross pattern whose centre is fixed at an angular position of (α = 60°, β = 0°) and a blue triangular pattern moving from the side (α = 85°, β = 180°) toward the red cross. h–j The reconstructed images showing the triangle as it moves through angular positions of (α = 60°, β = 180°), (α = 40°, β = 180°), and (α = 20°, β = 180°). The digitally generated callouts provide stereoscopic vision (for a human observer) of the hemispherical images.
Figure 4b shows the experimental setup for panoramic imaging of a square pattern with the BAC-eye. Details of the panoramic imaging system are given in ‘Methods’. A mask with a square opening 300 μm in width (Fig. 4c) was placed in front of the BAC-eye to project a square pattern onto it. Figure 4d shows a side-view optical image of the projected light on the BAC-eye, taken with a separate digital single-lens reflex camera equipped with a macro lens at an angle of α = 15° and β = 180°. A telecentric lens was used to magnify the image at the flat base of the BAC-eye and project it onto the CMOS camera. The telecentric lens is not necessary in practical applications, where the BAC-eye can be attached directly to an image sensor; it was used in this experiment solely to magnify the image and improve spatial sampling. Figure 4e and Supplementary Fig. 23a show the image of the square recorded by the camera. The corresponding panoramic 3D image of the square is shown in Fig. 4f.
We also demonstrated that the BAC-eye can image objects at different angular positions across its 170° viewing angle. Figure 4g schematically shows the experimental setup for imaging two objects at different angular positions. A red cross with a line width of 300 μm was placed at a fixed angular position of α = 60° and β = 0° (the centre position of the cross), while a blue triangle with a line width of 200 μm was moved from the side toward the red cross. The panoramic views reconstructed with the blue triangle centred at (α = 60°, β = 180°), (α = 40°, β = 180°), and (α = 20°, β = 180°) are shown in Fig. 4h–j, respectively. The patterns are clearly recognizable, and the triangle is imaged with uniform size and shape at the different angular positions. The image detected by the CMOS camera for the triangle centred at an angular position of α = 20° and β = 180° is shown in Supplementary Fig. 23b. Because coherent monochromatic lasers were used as the illumination sources in this demonstration, different portions of the incident light interfere, and granular speckle patterns are therefore visible in the sub-images of the ommatidia. In nature, arthropods quickly detect and escape from predators and track prey based on information (e.g., position, direction, and speed of motion) provided by their peripheral vision. Such wide-angle, motion-based sensing offers advantages in applications ranging from large-scale surveillance to navigation in endoscopic surgery.
3D point-source tracking with the BAC-eye
The natural compound eyes of fruit flies and worker bees have poor resolution for static images, but they are highly sensitive to motion in three dimensions. Similarly, the BAC-eye can be used to track the position of objects in three dimensions. In contrast to conventional monocular vision systems, compound eyes have the visual advantage of depth perception. Supplementary Fig. 24 illustrates a scenario in which objects at different distances are detected using imaging systems equipped with a conventional fisheye lens and with the BAC-eye, respectively. The images captured by the fisheye lens are identical, indicating that it cannot distinguish the absolute distances of the objects. In contrast, the BAC-eye can determine the object distance. In this section, we demonstrate 3D position tracking of a green point light source with the BAC-eye, as shown in Fig. 5a. Diverging green light emitted from an optical fibre is captured by the BAC-eye and projected onto a CMOS camera. The light spots projected onto the CMOS camera through the BAC-eye (Fig. 5c–h) change in position and diameter depending on the angular position of the point source and the distance between the point source and the centre of the BAC-eye. The light spot on the CMOS camera is well fitted by a Gaussian function (Fig. 5c–e). Figure 5f–h shows the corresponding panoramic views of the point source at three distances. When the point source moves away from the BAC-eye, the illuminated area on the compound eye grows and the diameter of the light spot imaged on the camera increases, and vice versa (Supplementary Fig. 25). Therefore, the centre position and the width of the imaged light spot on the CMOS camera can be calibrated to obtain the 3D position of the point source as it moves. Figure 5i shows the calibration curve relating the distance to the width of the light spot on the camera. The angular position of the point source is determined from the centre position of the light spot on the camera (see ‘Methods’). Figure 5b shows the 3D positioning of the point source at different locations. For the calibration process, the yellow and green solid data points show the actual and measured positions of the point source, respectively. The yellow and green circles show the actual and measured positions of a point source whose position was unknown a priori. The light distribution and the reconstructed images from this nominally unknown point source are shown in Fig. 5j, k and Supplementary Fig. 26. The measured positions are consistent with the actual positions of the point source, with a root-mean-square deviation of <0.16. The precision of the position tracking can be further improved by increasing the number of ommatidia in the BAC-eye and the bit depth of the CMOS camera. This 3D position-tracking capability allows the BAC-eye to quantitatively locate a moving light source, which could potentially be implemented for advanced 3D phototaxic navigation and search applications, e.g., as a sensor to guide a robotic capsule endoscope to locate fluorophore-labelled lesions.
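A minimal sketch of the tracking step is given below; the isotropic 2D Gaussian fit and the names are ours, and fwhm_to_distance stands in for a pre-measured calibration such as the curve in Fig. 5i.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sigma):
    x, y = xy
    return amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * sigma ** 2))

def locate_point_source(frame, fwhm_to_distance):
    """Fit the light spot at the base of the BAC-eye with an isotropic 2D
    Gaussian, then map its centre to an angular position (as in 'Methods') and
    its width to a radial distance via `fwhm_to_distance`, a pre-measured
    calibration function standing in for the curve of Fig. 5i."""
    ny, nx = frame.shape
    y, x = np.mgrid[0:ny, 0:nx]
    p0 = [frame.max(), nx / 2.0, ny / 2.0, min(nx, ny) / 8.0]
    (amp, x0, y0, sigma), _ = curve_fit(
        gauss2d, (x.ravel(), y.ravel()), frame.ravel(), p0=p0)
    fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(sigma)
    return (x0, y0), fwhm_to_distance(fwhm)   # spot centre, radial distance
```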


a Schematic diagram of the point light source tracking experiments. b The positions of the point light source. The yellow and green solid dots are the target and measured positions used for calibration, respectively. The yellow and green circles are the target and measured positions from experiments in which the light location is not known a priori. c–e The light distribution collected by the BAC-eye and f–h the corresponding hemispherical images of the light spots at radial distances of 5, 7, and 9 mm from the origin. Yellow dots are the average grayscale values measured at the proximal ends of the waveguides. The colour scale represents the intensity level of the detected light. i The relation between the full width at half maximum (FWHM) of the light distribution obtained by the BAC-eye and the distance from the origin to the point light source. j The light distribution obtained for the point light source at a nominally unknown position and k the corresponding hemispherical image of the light spot.

