Here is a video of the lab where I was doing my PhD. I speak Russian in it, but it has English captions that I made myself.
Here is a demonstration of my semester project for ISE 716: Automatic Systems Engineering, completed in 2021. It uses an Indexed Line with 2 Machining Stations from FischerTechnik and an Allen-Bradley PLC.
The task of the project was to develop a PLC control program that strictly follows a predefined sequence of operations for an automated manufacturing line. After part detection and transfer between conveyors, the system must perform a 5-second milling operation followed by a 10-second drilling operation, with all movements and stops driven by sensor feedback and timers.
The control logic was also required to handle overload conditions: if a part enters a machining station while another part is still being processed downstream, the entire system must immediately stop, disable normal operation, and activate fault indication for a fixed time before allowing recovery.
This video shows one of my early robotics projects from 2021, created for the ISE Department Open Door Day. It demonstrates the integration of a UR5e collaborative robot with a Tormach PCNC 440 CNC mill. No metal was harmed in the making of this demo - the machine is air-cutting only (no tool installed) for safety and logistics - but you get the idea 🙂
This video shows my first hands-on experiment with digital twins and IIoT, long before I discovered modern simulation platforms like Isaac Sim.
I built a basic digital twin of a VF-2 CNC machine using the Fusion 360 Python API. A Raspberry Pi 4B is connected to the real CNC and publishes live machine data - axis positions and spindle speed - over MQTT. A custom Fusion 360 add-in subscribes to that data and drives the joints of the 3D model in near real time.
Is it laggy? Yes.
Is it rough? Absolutely.
Does it demonstrate the core digital-twin concept? 100%.
This project was about proving the pipeline:
physical machine → live data → virtual model reacting to reality.
Source code:
🔌 MQTT data publisher (CNC → IIoT): https://github.com/pkoprov/Haas-Data-Collection
🧠 Fusion 360 Digital Twin app: https://github.com/pkoprov/VF-2_Digital_Twin
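For context, the publisher side of that pipeline boils down to packaging a snapshot of machine state as JSON and pushing it to an MQTT broker. A minimal sketch of that idea (the topic name and payload fields here are illustrative assumptions, not necessarily what the linked repos use):

```python
import json
import time

def build_payload(x, y, z, spindle_rpm):
    """Package one snapshot of machine state as a JSON string.

    Field names are illustrative; the actual repos may use a
    different schema.
    """
    return json.dumps({
        "timestamp": time.time(),
        "axes": {"x": x, "y": y, "z": z},  # axis positions, mm
        "spindle_rpm": spindle_rpm,
    })

# On the Raspberry Pi, this string would then be published with an
# MQTT client (e.g. paho-mqtt): client.publish("cnc/vf2/state", payload)
payload = build_payload(120.5, -34.2, 10.0, 8000)
print(payload)
```

On the Fusion 360 side, the add-in simply does the mirror image: subscribe to the topic, parse the JSON, and write the axis values into the model's joint parameters.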
This was an early, unpolished project - but it marked the starting point of my work in digital twins, smart manufacturing, and cyber-physical systems.
This video shows an experiment where a collaborative robot is used as a tool-handling system for a CNC machine.
In simple terms: an expensive tool changer — but with a point.
Instead of buying additional CNC hardware, a Universal Robots cobot is attached to an existing brownfield CNC (a Tormach) to extend its capabilities.
Does it always make economic sense? Not necessarily.
Does it demonstrate what cobots are actually good at? Yes.
Cobots are not just for part loading. They can:
Handle and swap tools
Extend legacy machines without retrofitting
Add flexibility where fixed automation would be expensive or rigid
This is a practical demo of how robotic handling can replace specialized hardware when flexibility matters more than cycle time.
No hype. Just a real lab-style experiment showing what’s possible when you combine cobots with existing machines.
This short demo (42s) shows the workflow I was aiming to build for CSC 591 “Software for Robotics Today” using NVIDIA Isaac Sim: a CNC machine “talking” to a UR10e through a digital twin, with computer vision used to locate stock and guide pick-and-place.
I didn’t have enough time (or the right setup) to fully complete the Isaac Sim + CV integration, but the motion sequence goal is demonstrated here: the UR10e opens the side door on a Haas VF-2, unloads a finished part from an Air Vise, places it on the completed-parts table, grabs new stock from the stock table, loads it into the CNC, returns, and closes the door.
In this video, I show a real CNC machine and collaborative robot sharing a live digital twin.
A Haas VF-2 CNC machine and a UR10e robot are connected to a physics-based digital twin running in NVIDIA Omniverse / Isaac Sim. When the real CNC moves its table, the digital twin moves at the same time. The robot and machine are continuously synced with their virtual versions, not pre-animated.
This setup is more than visualization. The digital twin knows:
where the machine and robot are,
what they are doing right now,
and how they relate to each other inside the cell.
The CNC sends live data using industrial messaging (MQTT Sparkplug B).
The robot syncs through ROS 2 and real-time data exchange.
Both directions work: the real machines update the twin, and the twin can drive the robot.
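Part of what makes Sparkplug B scalable is its fixed topic layout, which lets subscribers discover nodes and devices without prior configuration. A small sketch of that topic convention (group, node, and device names here are made up for illustration; real Sparkplug B payloads are protobuf-encoded, which is omitted):

```python
# Sparkplug B topic layout: spBv1.0/<group>/<msg_type>/<edge_node>[/<device>]
SPARKPLUG_NAMESPACE = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic string; rejects unknown message types."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

# Device-level data from a hypothetical edge node attached to the CNC:
print(sparkplug_topic("Lab", "DDATA", "rpi-gateway", "VF-2"))
# spBv1.0/Lab/DDATA/rpi-gateway/VF-2
```

The BIRTH/DEATH message types are what give the twin its state awareness: the broker and subscribers always know which machines are online and what metrics they expose.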
The long-term goal of this work is smarter CNC tending and material handling, where robots don’t rely on hard-coded waypoints but instead react to what’s happening in the cell. NVIDIA Isaac Cortex is used as the decision layer to enable event-driven behavior and future computer-vision integration.
This video is part of my PhD research on digital twins and adaptive manufacturing for small and medium-sized factories.
In this video, I show how one can "easily" create a model of a simple robot in Isaac Sim, convert it to URDF, and visualize it in RViz.
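For a sense of what the conversion target looks like, here is a minimal single-joint URDF of the kind RViz can display, generated and sanity-checked with Python's standard library (link and joint names and all dimensions are arbitrary placeholders, not the robot from the video):

```python
import xml.etree.ElementTree as ET

# A minimal URDF: one fixed base link, one revolute arm link.
URDF = """<?xml version="1.0"?>
<robot name="simple_bot">
  <link name="base_link">
    <visual>
      <geometry><box size="0.2 0.2 0.1"/></geometry>
    </visual>
  </link>
  <link name="arm_link">
    <visual>
      <geometry><cylinder radius="0.02" length="0.3"/></geometry>
    </visual>
  </link>
  <joint name="base_to_arm" type="revolute">
    <parent link="base_link"/>
    <child link="arm_link"/>
    <origin xyz="0 0 0.05"/>
    <axis xyz="0 0 1"/>
    <limit lower="-3.14" upper="3.14" effort="10" velocity="1.0"/>
  </joint>
</robot>
"""

# Quick structural check before handing the file to RViz.
robot = ET.fromstring(URDF)
print(robot.get("name"), len(robot.findall("link")), len(robot.findall("joint")))
```

Isaac Sim's exporter produces the same kind of link/joint tree from the USD stage; the pain points in the video are mostly about getting joint frames and axes to survive the conversion.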
This video shows my presentation at the 53rd SME North American Manufacturing Research Conference (NAMRC 53), where I present our work on applying digital twins and low-code IIoT platforms to real manufacturing systems.
The talk is based on the peer-reviewed paper “Industrial Metaverse Meets IIoT: Low-Code Platforms for Machine-to-Machine and Human-to-Machine Integration.” The focus is practical, not hype: how small and medium-sized manufacturers can build useful digital twins without massive budgets or armies of software engineers.
What is demonstrated and discussed:
A one-way digital twin (digital shadow) of a Haas VF-2 CNC machine for real-time monitoring
A two-way digital twin of a UR10e collaborative robot enabling simulation and real-time control
MQTT Sparkplug B for structured, scalable machine-to-machine communication
NVIDIA Omniverse / Isaac Sim for high-fidelity digital twin visualization and motion generation
Low-code / no-code tools (Node-RED) to enable fast integration and operator-friendly workflows
Alignment with ISO 23247 digital twin standards
The key message: digital twins and the Industrial Metaverse are not just for large enterprises. With the right architecture and low-code tools, SMEs can deploy real, working IIoT systems that support monitoring, automation, and human-in-the-loop control.
The UR5e demonstrates precision assembly: the back cap is fixed in a pneumatic vise, the robot places the stator and rotor, mounts the front cap, then does a quick victory wave. After that comes reverse mode: a clean, step-by-step disassembly. Simple, repeatable, and fully robotic.
Here are the results from 4 groups of students who solved the tasks I gave them in the course ISE 416 Manufacturing Engineering II: Automatic Systems Engineering. They had no robotics background before.
1st task: play the Tower of Hanoi game with a UR5e cobot
2nd task: write "ISE" with a marker using a Fanuc industrial robot