A provider of cleaning technology solutions turned to Altoros to optimize its proprietary contamination detection algorithm and convert it to a full-scale iOS-based software development kit (SDK).
Brief results of the collaboration:
- By migrating its proprietary contamination detection algorithm to a modern technology stack and extending its functionality, the company was able to white-label the solution and offer it either as a standalone SDK or embedded in the existing product.
- With a proof of concept delivered in just 2 months, the provider validated the SDK’s feasibility and presented it to a few university campuses, onboarding them as clients.
- Thanks to the introduced optimizations, the SDK can now assess the efficiency of facility cleaning/disinfection in just 3 seconds instead of 2 minutes.
Based in Canada, the customer provides solutions to audit cleaning/disinfection procedures in high-traffic areas. In 2021, the company received a US$4.5-million grant to develop a system that detects microbial contamination on surfaces within facilities.
The customer’s offering comprised a tripod, a lamp with ultraviolet (UV) and LED bulbs, a biotracker spray, and an iPad mini app. The cleaning staff would mount an iPad mini on the tripod, attach and switch on the lamp, and use the iPad’s camera to take 10 photos of the surface. After cleaning, a user would apply the spray and take another 10 photos of the area. Lying at the core of the app, the proprietary algorithm would compare the two sets of photos and score the cleaning efficiency on a 100-point scale. However, the algorithm relied on an outdated technology stack incompatible with newer iPad mini models. In addition, the slightest movement of the device degraded image quality and assessment precision.
With Altoros, the customer wanted to optimize the algorithm, extend its functionality, and convert it to a mature SDK that could be white-labeled and offered both together with and separately from the existing system.
Under the project, the team at Altoros had to address the following issues:
- Performance limitations of the iPad’s central processing unit (CPU) stretched the parallel analysis of 20 photos to 2 minutes.
- The limited memory of the iPad mini prevented the algorithm from analyzing images at their original size, resulting in app crashes.
- To capture quality images of the surface, it was crucial to switch the lamp’s 16 ultraviolet and 16 LED bulbs in a particular sequence. Done manually, this involved multiple steps highly prone to human error.
- The hue parameter, used to highlight contamination in images, was hard-coded in the proprietary algorithm. This made it impossible to customize the values, which was key to achieving better microbe detection precision.
Stage 1. After analyzing the customer’s requirements, engineers at Altoros conducted comparative research into technologies that would allow migrating all the existing features of the proprietary algorithm to a new stack without incompatibilities. Swift was chosen as the native language for iOS-based SDKs and apps, while Metal Shading Language offered advanced graphics processing unit (GPU) tooling.
Stage 2. To speed up the analysis of 20 photos, the developers enabled parallel aggregation on both the GPU and CPU. By lowering CPU overhead, the team at Altoros decreased the processing time from 120 seconds to just 3.
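Offloading per-pixel work to the GPU with Metal typically follows a standard compute-dispatch pattern. Below is a minimal, hypothetical sketch of how such a dispatch might look; the kernel name (`compareFrames`) and texture setup are assumptions, not the customer’s actual code.

```swift
import Metal

// Hypothetical sketch: dispatching a per-pixel comparison kernel on the GPU.
// One thread is launched per pixel of the "before" texture.
func comparePhotos(before: MTLTexture, after: MTLTexture) throws {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary(),
          let kernel = library.makeFunction(name: "compareFrames"),
          let queue = device.makeCommandQueue() else { return }

    let pipeline = try device.makeComputePipelineState(function: kernel)
    guard let buffer = queue.makeCommandBuffer(),
          let encoder = buffer.makeeComputeCommandEncoder() ?? buffer.makeComputeCommandEncoder() else { return }

    encoder.setComputePipelineState(pipeline)
    encoder.setTexture(before, index: 0)
    encoder.setTexture(after, index: 1)

    // Threadgroup sizes derived from the pipeline's execution limits.
    let w = pipeline.threadExecutionWidth
    let h = pipeline.maxTotalThreadsPerThreadgroup / w
    let perGroup = MTLSize(width: w, height: h, depth: 1)
    let groups = MTLSize(width: (before.width + w - 1) / w,
                         height: (before.height + h - 1) / h,
                         depth: 1)
    encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: perGroup)
    encoder.endEncoding()
    buffer.commit()
}
```

Because each photo pair is independent, 10 such dispatches can be encoded back to back, letting the GPU process them while the CPU aggregates earlier results.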
Stage 3. To prevent crashes, the engineers made it possible to automatically scale the images down to an optimal size once memory limits were exceeded.
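Downscaling under memory pressure can be done by halving an image until its estimated bitmap footprint fits a budget. The following is a minimal sketch under assumed values; the 200 MB budget and 4-bytes-per-pixel estimate are illustrative, not the SDK’s actual thresholds.

```swift
import UIKit

// Hypothetical sketch: shrink an image until its RGBA bitmap
// (4 bytes per pixel) fits within an assumed memory budget.
func fitToMemory(_ image: UIImage, budgetBytes: Int = 200_000_000) -> UIImage {
    func footprint(_ s: CGSize) -> Int { Int(s.width * s.height) * 4 }

    var size = image.size
    while footprint(size) > budgetBytes {
        size = CGSize(width: size.width / 2, height: size.height / 2)
    }
    guard size != image.size else { return image }

    // Re-render the image at the reduced size.
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}
```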
Stage 4. By integrating Core Bluetooth into the SDK, developers at Altoros enabled sending execution commands to the lamp, which specified the sequence for switching the UV and LED bulbs on and off.
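Sending such a command over Bluetooth Low Energy boils down to writing bytes to a characteristic on the connected peripheral. A hypothetical sketch, assuming the lamp exposes a writable command characteristic (the discovery and connection steps are omitted):

```swift
import CoreBluetooth

// Hypothetical sketch: pushing a bulb-sequence command to the lamp.
// The peripheral and characteristic are assumed to have been discovered
// via the usual CBCentralManager scan/connect flow.
final class LampController: NSObject, CBPeripheralDelegate {
    var lamp: CBPeripheral?
    var commandCharacteristic: CBCharacteristic?

    func send(sequence: [UInt8]) {
        guard let lamp, let characteristic = commandCharacteristic else { return }
        // .withResponse lets the SDK confirm the lamp accepted the command.
        lamp.writeValue(Data(sequence), for: characteristic, type: .withResponse)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didWriteValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        if let error { print("Lamp rejected command: \(error)") }
    }
}
```

Driving the 32 bulbs from code removes the error-prone manual switching steps entirely.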
Stage 5. Thanks to the new technology stack, the team was able to expose the hue parameter as a set of customizable values, thus improving precision when highlighting contaminated areas in the images.
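One common way to expose a formerly hard-coded value is a small configuration type that the SDK consumer can adjust and that the rendering code reads at runtime. The sketch below is purely illustrative; the field names and defaults are assumptions.

```swift
import simd

// Hypothetical sketch: the previously hard-coded hue becomes an SDK parameter.
public struct HighlightConfig {
    /// Hue (0–360°) used to tint detected contamination.
    public var hue: Float = 120
    /// Minimum score a pixel must reach before it is highlighted.
    public var threshold: Float = 0.35

    /// Values packed for upload to a shader constant buffer (hue normalized).
    var uniforms: SIMD2<Float> { SIMD2(hue / 360, threshold) }
}
```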
Stage 6. Using AVFoundation, engineers at Altoros made it possible to autoconfigure camera parameters (focus, ISO, white balance, etc.) and apply them to all the photos for improved quality. Then, the developers utilized the OpenCV library to detect camera motion and compensate for displacement of up to 10 pixels.
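Keeping camera settings identical across a 10-photo series means locking them once before the first shot. A minimal sketch of such a lock with AVFoundation follows; the lens position and ISO cap are illustrative assumptions.

```swift
import AVFoundation

// Hypothetical sketch: lock focus, exposure, and white balance so that
// every photo in a series is captured with identical settings.
func lockCamera(_ device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Fix the lens at an assumed mid-range position.
    device.setFocusModeLocked(lensPosition: 0.5, completionHandler: nil)

    // Keep the current shutter duration but cap ISO (assumed limit: 400).
    device.setExposureModeCustom(duration: device.exposureDuration,
                                 iso: min(400, device.activeFormat.maxISO),
                                 completionHandler: nil)

    // Freeze white balance at its current gains.
    device.setWhiteBalanceModeLocked(with: device.deviceWhiteBalanceGains,
                                     completionHandler: nil)
}
```

With settings frozen, any residual differences between frames come from device motion, which the OpenCV-based compensation step then corrects.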
Stage 7. Following best practices (creating reusable persistent objects, setting appropriate resource storage, etc.), the team converted the proprietary algorithm into a fully functional SDK in just 2 months. Finally, engineers at Altoros built a test iOS app to validate the feasibility of the delivered SDK.
US$4.5 million of funding raised
3 seconds on image processing
2 months on SDK development
Partnering with Altoros, the customer extended the functionality of its proprietary contamination detection algorithm and converted it into a white-label software development kit in just 2 months. With the delivered proof of concept, the company was able to validate the SDK’s feasibility, present the technology to a few university campuses, and onboard them as clients. Thanks to the introduced optimizations, the SDK is now capable of assessing the efficiency of cleaning/disinfection in just 3 seconds instead of 2 minutes.
Together with Altoros, the company is implementing further improvements, such as using ArUco markers to apply 3D transformation for motion compensation, with the goal of removing the tripod from its offering for the sake of convenience.
Programming languages
Swift, Metal Shading Language
Frameworks and tools
AVFoundation, Core Bluetooth, OpenCV