Russia: TMH Smart Systems has developed another solution for automated train operation and machine vision. The AI-based system has been tested and refined on shunting locomotives. The next logical step is to test it on the Ivolga EG2Tv electric train.
How the development began
TMH Smart Systems has been nurturing this idea for the past five years, ever since the company joined the development of a hybrid shunting locomotive with a remote control function. Computer vision is exactly the kind of technology that function could not do without: obviously, the system has to monitor and display everything going on around the locomotive, but it also has to identify obstacles along the way.
Later, TMH adopted computer vision for shunting locomotives built for large industrial companies. The technology was tried and tested in real operation, did not fail, and showed its true potential.
By the time the technology was ready, TMH had already mastered the production of the Ivolga EG2Tv EMUs. It was only logical, albeit challenging, to adapt the technology to the new trains: no easy task, since each train type imposes its own requirements on any add-ons.
Shunting locomotives tend to move at low speeds, so they have a short stopping distance. A locomotive needs 50 to 100 metres to spot an obstacle and hit the emergency brake, but it is an entirely different matter for an electric train: the figure would be much, much higher. The development team knew it was in for months of research and development, not to mention extensive testing of rolling stock equipped with cameras, sensors and computing units. The team needed to collect data and test the technology in different situations.
One of the most popular and well-known applications of computer vision is transport, particularly the automotive industry. It is a big market, with shorter stopping distances and generally shorter detection ranges than a train requires. At its core, the technology is the same for road vehicles and locomotives, but some things are fundamentally different on the railway, which makes development and adoption challenging, too. Engineers have to factor in stopping distances of hundreds of metres, complex railway-specific infrastructure, human signs and signals, complicated data collection and real-world running-in, harsh operating conditions, and longer intervals between maintenance checks.
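The order-of-magnitude gap in stopping distances follows from basic kinematics. A minimal sketch; the speeds and the deceleration value below are illustrative assumptions, not TMH figures:

```python
def braking_distance_m(speed_kmh: float, decel_ms2: float) -> float:
    """Distance to stop from a given speed at constant deceleration: d = v^2 / (2a)."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * v / (2.0 * decel_ms2)

# A shunting locomotive crawling at 25 km/h vs an EMU at 120 km/h,
# both braking at an assumed 0.5 m/s^2:
print(round(braking_distance_m(25, 0.5)))   # ~48 m
print(round(braking_distance_m(120, 0.5)))  # ~1111 m
```

Because distance grows with the square of speed, roughly five times the speed means more than twenty times the stopping distance, which is exactly why an EMU needs to see so much further ahead than a shunter.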
Developing the idea into a solution
The new technology brings together every solution that contributes to automatic rolling stock control. In plain English, with this technology a train can move unmanned, with no human involvement. The system has to learn a whole set of skills to detect obstacles correctly, and the researchers spent a lot of time teaching it the basics:
- recognise and categorise people into two groups: pedestrians and railway workers;
- identify railway infrastructure: tracks, other trains, traffic lights (including their colours), platforms, bollards, shackles;
- understand whether traffic lights and track switches are relevant to its track;
- identify civilian infrastructure objects: cars, bicycles, etc.;
- locate the train and measure the distance to the nearest important locations: platforms, railway stations, etc.;
- build a loading gauge, i.e., the maximum cross-section of rolling stock relative to whatever it passes through. It limits how far switches and traffic lights may encroach on the train's path and defines the so-called danger zones that must stay free of obstacles. The system calculates the coordinates of these zones, so if anything comes into view inside one, the train reacts accordingly, based on what exactly it reads from that foreign object.
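At its simplest, the danger-zone test described in the last item reduces to a geometric check against the train's swept path. A toy sketch; the gauge half-width and safety margin below are hypothetical values, not TMH parameters:

```python
def in_danger_zone(lateral_offset_m: float,
                   gauge_half_width_m: float = 1.8,
                   margin_m: float = 0.5) -> bool:
    """True if a detected object's lateral offset from the track centreline
    encroaches on the train's swept path plus a safety margin."""
    return abs(lateral_offset_m) < gauge_half_width_m + margin_m

print(in_danger_zone(1.0))  # True: inside the swept path
print(in_danger_zone(3.0))  # False: clear of the gauge
```

The real system works with full 3D coordinates rather than a single lateral offset, but the principle is the same: compute the zone, then test every detection against it.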
The engineers used neural networks to develop the system and teach it to identify everything. It took a lot of images to train the system.
Lower sensor unit above the coupler of the Ivolga. Source: TMH
“We used tonnes and tonnes of data. Multiple cameras, different resolutions, varying optical parameters, good weather, bad weather, camera angles: nothing was left out. We wanted to account for every situation possible, so we even used some images generated during tests. We also used stereoscopic vision: two cameras worked together to create one image at the same time, so the system was able to pinpoint the distance to an obstacle. That was not the only technology we used. Lidars came in handy, too. High speeds, however, certainly made things difficult for us. We picked the sensors, lidars and cameras (and we couldn’t have more or fewer; it had to be something in between, the balance, if you will), and then we synchronised them. Not all at the same time, though; that would have been too easy and had no effect. The goal was to trigger them one by one, in a sequence. We came up with a sophisticated synchronisation and processing algorithm for that”, says Mr Cherkasov, lead on-board system designer at TMH Smart Systems.
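The stereoscopic pair Mr Cherkasov mentions recovers distance by triangulation: the same object appears at slightly different horizontal pixel positions in the two cameras (the disparity), and depth is Z = f·B / d. A minimal sketch, with an illustrative focal length and camera baseline rather than the actual rig's parameters:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to an object from the pixel disparity between two aligned cameras."""
    if disparity_px <= 0:
        raise ValueError("object must appear in both images with positive disparity")
    return focal_px * baseline_m / disparity_px

# Assumed rig: 1400 px focal length, cameras mounted 0.5 m apart.
print(stereo_depth_m(1400, 0.5, 7.0))   # 100.0 m away
print(stereo_depth_m(1400, 0.5, 70.0))  # 10.0 m away
```

Note the inverse relationship: distant objects produce tiny disparities, which is why the accuracy of the stereoscopic pair matters so much at railway-scale ranges.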
While they were at it, the engineers designed a passenger boarding control system. Today this is a job done by an assistant train driver: once they see no more people getting on or off, they give a signal to the driver, and off the train goes. Now there is no human factor to consider and no mistakes to make: video cameras monitor the situation and, once they are sure there are no people in the way, send a signal to the driver's panel. Trains will depart faster, too, a welcome improvement as headways get shorter and shorter.
Safety is a separate matter, one the company does not take lightly. The safety criteria used for years cannot simply be applied to AI technology; it requires something entirely new. Of course, the company relies on the well-established safety criteria and requirements, but they are more of a skeleton than a system that can be used as-is. The skeleton brings together basic software and hardware, and everything else is built upon it. What is the best way to measure the performance of a technology that replicates human senses? Compare it to a real human's performance, of course. That is exactly what the company does: it digitises human performance and then tests the system against it, either at real testing grounds or virtually.
Details about the technology
Technically, the obstacle detection system is a set of equipment for an electric train, carefully selected and synchronised. It relies on different cameras with high-quality sensors, different focal lengths and panoramic views. Together they can track almost anything, from a wide shot to short range. The system also requires a stereoscopic pair: as mentioned before, two cameras act like human eyes and build a depth map.
Lux meters measure ambient brightness to help the system adjust accordingly. Sometimes cameras are not fast enough to adapt to a sudden change of light, like when the train leaves a tunnel. In that case, lidars take over: they use laser ranging, so changes in lighting do not affect them. Lidars, however, are very sensitive to precipitation and fog, so bad weather might cause them to make mistakes. TMH Smart Systems' experts wanted all parts of the system to help and complement each other, so they created elaborate algorithms. More than one unit is used to determine the train's location, and they are all integrated: GNSS, an inertial system, SLAM technology and special maps featuring important locations the system has found en route.
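The complementary behaviour described above (cameras blinded by sudden light changes, lidars degraded by precipitation) can be caricatured as a simple arbitration rule. The threshold and the rule itself are invented for illustration; TMH's actual fusion algorithms are far more elaborate:

```python
SUDDEN_LUX_CHANGE = 2000.0  # lux per second; hypothetical threshold

def primary_sensor(lux_change_rate: float, precipitation: bool) -> str:
    """Pick which sensor's detections to trust most in the current conditions."""
    if precipitation:
        return "camera"  # rain and fog scatter lidar returns
    if abs(lux_change_rate) > SUDDEN_LUX_CHANGE:
        return "lidar"   # e.g. exiting a tunnel while the cameras are still adapting
    return "camera"

print(primary_sensor(5000.0, precipitation=False))  # lidar
print(primary_sensor(10.0, precipitation=False))    # camera
```

In practice the system weights and combines all sources continuously rather than switching between them, but the rule captures why no single sensor is sufficient on its own.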
Cameras in the driver’s cab of the Ivolga. Source: TMH
Some technology and tools did not make the cut. For example, thermal imaging cameras identify people, animals, cars and other heat sources in the dark, but they cannot measure the exact distance to these sources, nor can they easily be fused with other devices for a fuller picture. Thermal imaging cameras are also very expensive and hard to source, so they are a no-go, at least for now.
“Sensors generate data in real time. The processor receives the data, analyses it and combines it, resulting in an accurate picture expressed in mathematical terms. Then our neural network algorithms come into action, identifying classes, object types and the like. Decision-making algorithms then analyse the results: the object, the distance, its location in relation to the train, anything that helps decide and move the train accordingly”, says Konstantin Shutilov, head of the company's computer vision department.
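Mr Shutilov's three stages (sensor fusion, neural network classification, decision-making) can be sketched in a few lines. The labels, distances and braking rule below are all invented for illustration and do not reflect TMH's actual logic:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str               # class assigned by the neural network stage
    distance_m: float        # range from the fused sensor picture
    lateral_offset_m: float  # offset from the track centreline

def decide(detections: list, stopping_distance_m: float) -> str:
    """Toy decision stage: brake if any object sits inside the stopping
    distance and close enough to the track to matter."""
    for d in detections:
        if d.distance_m <= stopping_distance_m and abs(d.lateral_offset_m) < 2.3:
            return "EMERGENCY_BRAKE"
    return "PROCEED"

scene = [Detection("pedestrian", 420.0, 0.4), Detection("car", 900.0, 5.0)]
print(decide(scene, stopping_distance_m=600.0))  # EMERGENCY_BRAKE
```

The key point is the separation of concerns: classification tells the system *what* it sees, while a separate decision layer turns that, plus geometry, into a driving command.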
Placing all the add-ons is a tough nut to crack. Engineers and designers scratch their heads over where to fit each new device while making sure it does not interfere with the existing equipment and remains fully functional.
“Modern trains are already overstuffed. New technology is good and all, but deciding where to put it is a real pain in the neck. It gives us a headache just to think about it, and it gets more challenging with each new train. So, what are we to do? We place new devices everywhere we can lay our hands on: walls, false panels, under the ceiling, under the car. It's like packing a suitcase. No empty space left! Anything goes!” says Mr Cherkasov.
Testing of the system
The engineers knew the system was effective, but it needed testing. The Moscow–Usovo line was chosen to trial the new technology for the Ivolga, and it was a roaring success. The system ran at five cycles per second and visualised the results twice per second. Everything the developers had hoped for was coming to life right before their eyes. In each cycle, the system finds and identifies objects up to 600 metres from the sensors and then makes decisions accordingly. Viewing angles range from 12 to 110 degrees depending on the scanning range. The difference is striking: 12 degrees for the far zone of up to 600 metres, and 110 degrees for the near zone of up to 150 metres.
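The trade-off between viewing angle and range is plain trigonometry: the lateral strip a camera covers at distance d with field of view θ is 2·d·tan(θ/2). Plugging in the figures from the tests:

```python
import math

def coverage_width_m(distance_m: float, fov_deg: float) -> float:
    """Width of the strip a camera sees at a given distance for a given field of view."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Narrow 12-degree optics watch the far zone; wide 110-degree optics the near zone.
print(round(coverage_width_m(600, 12)))   # ~126 m wide at 600 m
print(round(coverage_width_m(150, 110)))  # ~428 m wide at 150 m
```

The narrow far-zone lens keeps its pixels concentrated on the track ahead where early warning matters, while the wide near-zone lens sweeps the platforms and surroundings where obstacles appear at close range.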
Testing the computer vision system (sped up). Source: TMH Smart Systems
The obstacle detection system will be integrated with the other train control systems so it can respond to obstacles adequately regardless of external factors. The company is also looking to refine its accuracy algorithms so the system can build a better loading gauge. It also needs to learn to filter out irrelevant parts of the field of vision and improve the accuracy of the stereoscopic pair: the distance error is supposed to stay below 10%, no matter the distance.
“We are doing R&D on the Ivolga. Once we're finished, we'll build a sample car for mass production. I wager it will take us two years to get an electric train like that, with the automation system and everything, potentially mass-produced. But that is up to the customer. We hope our technology will be in demand in other areas as well. Metro systems can use it for unmanned operation. The industrial transport sector has been asking about it, too. It's really versatile. Mark my words: another two to four years, and companies will have a field day using the system. There are dozens of locomotives featuring this system now; can you imagine how many there will be in two years? I bet it's going to be hundreds,” says Mr Cherkasov.
Based on the Eye of the Tiger, TMH vektor magazine 2 (57) 2024