Object and position detection
Toolbox for robot vision
The software includes not just one detection model but several, enabling a wide variety of grasping and positioning tasks in robotics.
# Machine tending
# Depalletizing
# Assembly
Strong algorithms
State of the art in machine vision
The core of the software enables the detection and localization of rigid objects in 2D and 3D images. In addition to AI, the functions use established standard methods to determine the gripping point.
Robot
KUKA KR120
Camera
Zivid 2
Plane Picking
The core functionality of this service is the recognition and localization of one or more pre-trained object classes in point clouds captured with a 3D camera. Here, it is assumed that the objects are resting on one or more planar layers (e.g., a pallet, cardboard separators, or a conveyor belt). The achievable accuracy is sufficient for direct loading of a machine, without additional mechanical centering/alignment aids.
# High precision and reliability
(thanks to underlying machine learning approach)
# No teach-in required
(all training data generated automatically)
# Distinguishes known object classes
(for placing and sorting simultaneously)
# Automated teach-in of workpieces
(synthetic training data from CAD model)
# Higher precision & reliability
(continuously learning system)
# Handles mixed object types
(load carriers containing different object types)
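To make the planar-layer assumption above concrete: once the support plane (pallet, separator, belt) is known, anything rising above it is a pick candidate. The following is a minimal, purely illustrative sketch in Python/NumPy, not our API; the function names, synthetic data, and the height threshold are assumptions:

```python
import numpy as np

def fit_support_plane(points):
    """Least-squares fit of z = a*x + b*y + c to an (N, 3) point cloud."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs  # (a, b, c)

def points_above_plane(points, coeffs, min_height=0.01):
    """Return the points lying more than min_height above the fitted plane."""
    a, b, c = coeffs
    heights = points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c)
    return points[heights > min_height]

# Synthetic example: a flat pallet at z = 0 with one 5 cm tall object on it.
rng = np.random.default_rng(0)
pallet = np.c_[rng.uniform(0, 1, (200, 2)), np.zeros(200)]
obj = np.c_[rng.uniform(0.4, 0.5, (20, 2)), np.full(20, 0.05)]
cloud = np.vstack([pallet, obj])

coeffs = fit_support_plane(pallet)      # fit on the known support surface
picked = points_above_plane(cloud, coeffs)
```

In the real service the plane is estimated robustly from the full scene and the object classes are recognized by a trained model; the sketch only shows the geometric core of the layer assumption.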
Robot
KUKA Agilus
CAMERA
Baumer VCXG-51M
Contour Picking
Some processes do not lend themselves to imaging with a 3D sensor, either because the parts exhibit unfavorable reflectance or because the relatively long image acquisition time is prohibitive for the given cycle time. For these cases, we offer a service for localizing items on a planar surface in 2D images. The service learns from each image taken, minimizing the influence of ambient light, which is all the more important in the absence of the active illumination typically found in 3D sensors.
# High precision and reliability
# Short cycle times
# Automatic & perpetual training
(continuous and unsupervised learning of environmental conditions such as conveyor belt wear or lighting)
# Automatic training
(bootstrapping approach replaces manual labeling)
# Higher precision & reliability
(continuously learning system)
# Suitable for short cycle times
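For intuition about what 2D localization on a planar surface involves, here is a minimal sketch that recovers an object's position and orientation from a binary segmentation mask via image moments. This is illustrative only, not our API; the function name and the synthetic mask are assumptions:

```python
import numpy as np

def locate_object(mask):
    """Centroid (x, y) and in-plane orientation (radians) of a binary mask,
    computed from first- and second-order image moments."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    mu20 = ((xs - cx) ** 2).mean()   # central second-order moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return cx, cy, angle

# Synthetic example: an axis-aligned rectangle, wider than tall.
mask = np.zeros((100, 100), dtype=bool)
mask[40:50, 20:80] = True
cx, cy, angle = locate_object(mask)
```

With a calibrated camera and a known working plane, such a pixel pose converts directly into a metric grasp pose; the service itself additionally handles segmentation and the continuous adaptation to lighting changes described above.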
Robot
Universal Robot
CAMERA
Zivid 2
Calibration
Like any other sensor, cameras must undergo regular calibration to maintain sufficient accuracy and to compensate for changing environmental conditions (vibrations, temperature fluctuations, etc.). This is often done with calibration software provided by the camera vendor, which cannot exchange data directly with the robot cell, making the process error-prone and time-consuming. Our API enables complete integration of all common calibration procedures (extrinsic, intrinsic, and hand-eye calibration) into the automated process, for almost any type of robot and camera.
# Complete integration into automated process
# Works for any camera-robot combination
# Automated approach to hand-eye calibration
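At its geometric core, extrinsic calibration reduces to estimating a rigid transform from point correspondences between the robot and camera frames. A minimal sketch of the standard Kabsch/SVD solution, illustrative only and not our API:

```python
import numpy as np

def rigid_transform(P, Q):
    """Kabsch algorithm: least-squares rotation R and translation t
    such that Q ~= P @ R.T + t, for corresponding (N, 3) point sets."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)        # center both sets
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)          # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

# Synthetic check: recover a known 90-degree rotation about z plus a shift.
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([0.1, -0.2, 0.3])
P = np.random.default_rng(1).uniform(-1, 1, (10, 3))
Q = P @ R_true.T + t_true
R, t = rigid_transform(P, Q)
```

Hand-eye calibration additionally solves the AX = XB problem over many robot poses, but every variant rests on rigid-transform estimation of this kind.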
Robot
Universal Robot UR5e
CAMERA
Zivid 2 M70
Accurate Bin Picking
The precise picking of an object directly from a box of bulk goods (“bin picking”), e.g., for assembly or machine loading, is advantageous for several reasons. There is no need to keep object-specific carriers on hand, no manual loading of the carriers is necessary, and the transport density remains unchanged.
# Direct picking from box/bin
# No specific load carriers needed
# Automatic training
(bootstrapping approach replaces manual labeling)
# Higher precision & reliability
(continuously learning system)
# Suitable for short cycle times
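To sketch how a detected object pose becomes a robot grasp target, the usual chain composes homogeneous transforms: the grasp pose in the robot base frame is (base to camera) times (camera to object) times (object to grasp). A minimal illustration; all frame values below are made-up assumptions, with identity rotations for simplicity:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Assumed frames: base->camera comes from calibration, camera->object from
# the detection model, object->grasp from the CAD-defined grip point.
T_base_cam = homogeneous(np.eye(3), [0.5, 0.0, 1.2])    # camera above the bin
T_cam_obj = homogeneous(np.eye(3), [0.1, -0.05, 0.9])   # detected object pose
T_obj_grasp = homogeneous(np.eye(3), [0.0, 0.0, 0.02])  # grip 2 cm off origin

T_base_grasp = T_base_cam @ T_cam_obj @ T_obj_grasp
print(T_base_grasp[:3, 3])  # grasp point in robot base coordinates
```

In practice the rotations are non-trivial (the camera looks down into the bin, and objects lie in arbitrary orientations), which is exactly why the detection must deliver a full 6-DoF pose rather than only a position.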
Haven't found what you're looking for? No problem!
Our object detection models can be used for many different applications — even in combination. For OEM customers, we also integrate our software into your own products so that you can use state-of-the-art machine vision without having to develop it yourself.
Get in touch with us.
Request appointment