
Oct. 24, 2022

Advancing autonomous vehicles to usher in a safer future

Michigan State is leading innovations in technology that will evolve our transportation ecosystem

Michigan State University is home to nearly 50 experts, from a variety of research fields, shaping the future of mobility.

That’s because improving transportation — making it safer, smarter and more accessible no matter how people choose to get around — isn’t a matter of solving one problem. It requires embracing complexity and working to evolve an entire ecosystem.

Autonomous vehicles present a microcosm of that ecosystem. Building the technology that will make these cars and trucks more connected, more trustworthy and more capable of driving themselves requires an array of talents and know-how.

That’s one of the reasons why MSU is a premier place to get an early look at the future of autonomous vehicles.

“There are so many different areas of expertise you need to pursue this technology,” says Hayder Radha, an MSU Foundation Professor of electrical and computer engineering. “We have it all in one place.”

Beyond this breadth of expertise, Michigan State also has a campus that allows researchers to study these vehicles in a variety of settings that bear the brunt of all four seasons. All of this makes MSU a unique proving ground that will help autonomous vehicles best integrate into the mobility ecosystem.

At MSU, researchers produce, gather and analyze incredibly valuable data with different autonomous platforms on campus, including a first-of-its-kind autonomous bus. They ask brand new questions about how these vehicles will interact with people and infrastructure. They also help ensure a vehicle’s many sensors work synergistically to make sure the vehicle operates as safely as possible, no matter how challenging the conditions.

“The immediate impact of what we’re doing is really achieving a high level of safety,” says Radha, who is also the director and faculty coordinator of MSU’s Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS.

“First and foremost, autonomous vehicles will be saving lives,” he says.

This example of multimodal sensor fusion from the Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS, at MSU shows a camera recording on top and lidar readings below. The color in the lidar screen indicates how near or far an object is.
Credit: Courtesy of Daniel Morris/Xiaoming Liu/CANVAS

A new level of synergy

Underlying this assertion is the fact that roughly 90% of fatal roadway collisions are the result of human error, Radha says.

MSU Foundation Professor Hayder Radha (right) and doctoral student Daniel Kent (left) work in the Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS, lab. Credit: Michigan State University

The promise of autonomous vehicles is that they will prevent those errors by taking human decision-making out of the equation. The rub is that removing that component also means giving up things humans do remarkably well when sensing and reacting to their environments.

Yet technology can make up for that, as evidenced by the role autonomy has in our lives today.

For people in California or the southwestern U.S., driverless vehicles might already be serving as taxis and delivery trucks. More broadly, though, autonomous features that adjust cruise control speeds, keep drivers from drifting out of their lanes and monitor blind spots to prevent collisions are becoming more commonplace.

Increasingly affordable sensors and sophisticated software have enabled researchers to replicate what humans do. In some cases, these systems can outperform humans.

“Humans can really only focus on one thing at a time,” says Daniel Morris, a colleague of Radha’s in CANVAS and an associate professor in the Department of Electrical and Computer Engineering. He also has a joint appointment in Biosystems and Agricultural Engineering.

“An autonomous vehicle will have sensors looking in all directions at all times. And they can’t get distracted. That right there has a great potential for safety,” Morris says. “We can also combine more modalities to improve the sensing. People aren’t so good at seeing through fog, rain and snow. But we have radar, thermal infrared cameras and some lidar that can do those things.”

MSU Associate Professor Daniel Morris presents at a 2019 event in the Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS, lab space. Credit: Courtesy of MSU College of Engineering

No single sensor is perfect on its own, and one of the strengths of the CANVAS team lies in what’s known as multimodal sensor fusion. That is, CANVAS researchers are developing algorithms that enable autonomous vehicles to fuse data captured by different devices more seamlessly and effectively.

For example, video cameras take images of a 3D world and condense them into two dimensions. But autonomous vehicles need to account for all three.

Light detection and ranging, or lidar, technology can determine how far away objects are by shining laser light on them and detecting how long it takes that light to return. Combining cameras with lidar thus gives a vehicle vision with depth perception.
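
The arithmetic behind that ranging step is straightforward: a pulse travels out and back, so the one-way distance is half the product of the travel time and the speed of light. A minimal sketch in Python; the 200-nanosecond reading is an illustrative value, not CANVAS data.

```python
# Minimal sketch of lidar time-of-flight ranging (illustrative values only).
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_range_m(round_trip_time_s: float) -> float:
    """Distance to a target from the time a laser pulse takes to return.

    The pulse travels out and back, so the one-way distance is half
    the round-trip distance.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a return detected 200 nanoseconds after the pulse fired
# corresponds to an object roughly 30 meters away.
print(f"{lidar_range_m(200e-9):.1f} m")  # ~30.0 m
```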

Another example that Morris’s team is currently working on combines radar and video.

“Radar is already used in some cars right now for collision avoidance. By measuring distance to objects as well as incoming velocity, a radar-based system can predict an impending collision and automatically activate brakes,” Morris says.
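
Morris doesn’t spell out the decision logic those production systems use, but the core idea can be sketched as a time-to-collision check. The braking threshold and the example numbers below are illustrative assumptions, not values from any real system.

```python
def should_brake(range_m: float, closing_speed_m_s: float,
                 ttc_threshold_s: float = 2.0) -> bool:
    """Trigger automatic braking when time to collision drops below a threshold.

    closing_speed_m_s is the rate at which the gap is shrinking (positive
    means the object is getting closer). The 2-second threshold is an
    illustrative assumption, not a production calibration.
    """
    if closing_speed_m_s <= 0:  # object is holding distance or pulling away
        return False
    time_to_collision_s = range_m / closing_speed_m_s
    return time_to_collision_s < ttc_threshold_s

# A car 30 m ahead closing at 20 m/s leaves 1.5 s to impact: brake.
print(should_brake(30.0, 20.0))  # True
```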

So, radar is incredibly useful, but it’s not equally useful in all directions. Imagine a car traveling into a four-way intersection from the south with a radar attached to its front like a hood ornament. The radar is going to detect and predict vehicle motion coming from the north better than it would a vehicle traveling east to west.

“Radar has a weakness with cross traffic,” Morris says. “The question became: Could we upgrade radar for cross traffic?”

By combining radar information with video data, Morris and his team showed it’s possible to capture the full velocity profile — front to back, side to side — of other vehicles on the road.
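
The team’s published method isn’t detailed in the article, but the geometric intuition can be sketched: radar’s Doppler return measures only the component of motion along the line of sight, while a camera can track how quickly an object’s bearing changes; multiplying that bearing rate by the radar range recovers the cross-traffic component. The function and numbers below are a hypothetical illustration of that idea, not the CANVAS algorithm.

```python
import math

def full_velocity(range_m: float, radial_speed_m_s: float,
                  bearing_rate_rad_s: float) -> tuple[float, float]:
    """Recover an object's full 2D velocity from radar and camera cues.

    radial_speed_m_s: Doppler speed along the line of sight (radar).
    bearing_rate_rad_s: how fast the object's bearing changes (camera tracking).
    The tangential (cross-traffic) component is range * bearing rate.
    This captures the geometric intuition only, not the CANVAS method itself.
    """
    tangential_speed_m_s = range_m * bearing_rate_rad_s
    speed = math.hypot(radial_speed_m_s, tangential_speed_m_s)
    heading_rad = math.atan2(tangential_speed_m_s, radial_speed_m_s)
    return speed, heading_rad

# A crossing car 40 m away with almost no Doppler return (0.5 m/s) but a
# bearing drifting at 0.25 rad/s is actually moving about 10 m/s sideways.
speed, heading = full_velocity(40.0, 0.5, 0.25)
print(f"{speed:.1f} m/s at {math.degrees(heading):.0f} degrees off the line of sight")
```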

Both projects illustrate the advantages of being able to combine multiple sensor inputs, as well as the advantages of MSU’s researchers having connections with the automotive industry. Morris’s project with lidar was supported by Changan Automobile and his recent radar work was inspired by Ford Motor Company.

“I wasn’t thinking about radar and camera fusion until we talked to Ford. They had seen what we did with lidar and asked if we could do it for radar,” Morris says.

In fact, Ford has been supporting the work of CANVAS since the team was organized under Radha’s leadership about five years ago. And, further reflecting the interdisciplinary spirit of this field, it's not just automotive companies who are getting behind the research.

For example, the Semiconductor Research Corporation — a consortium that includes companies such as Intel, Samsung and Texas Instruments — is also supporting the multimodal sensor fusion efforts of CANVAS.

“It’s been great working with partners in industry. They want the technology and they’re eager for results, and they help us with research directions based on what they’ll need a couple years down the line,” Morris says.

This example of multimodal sensor fusion from the Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS, at MSU shows the fusion of camera and radar data. This enables researchers to measure the speed and direction (green lines) of other vehicles (blue outlines). Credit: Courtesy of Daniel Morris/Xiaoming Liu/CANVAS

‘Safe and cohesive’

Both projects also underscore one of the most important and easiest-to-overlook issues when it comes to autonomous vehicles, says Daniel Kent, a doctoral student working in Radha’s group.

“You need to use a lot of care bringing things together in a way that is both safe and cohesive,” Kent says.

He’s standing behind a bright green Chevrolet Bolt, an electric car adorned with six cameras, six radar devices, two lidar sensors and a precision GPS system. On the opposite side of the garage in the CANVAS lab space sits a hybrid Lincoln MKZ with a similar sensor rig. Both cars are platforms for testing and developing the team’s hardware and software architecture.

Michigan State’s autonomous Chevrolet Bolt goes for a spin. Credit: Courtesy of the Student Organized Autonomy Research Group

The Bolt’s trunk is open, revealing a computer system that takes up all the floor space with cables, circuit boards, power supplies and a liquid cooling system. A nearby screen reveals a glimpse of the code that enables the car’s sensors and systems to communicate with each other and the outside world.

To Kent, that software component is the most compelling challenge of autonomous vehicle development.

“Adding another radar, for example, wouldn’t be the biggest deal,” Kent says. “It’s getting that to then interact and communicate with everything else.”
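
The article doesn’t say which software framework ties the platforms’ sensors together, but the integration challenge Kent describes is typically handled with publish/subscribe middleware. A minimal sketch, assuming a ROS 2-style setup (rclpy) with hypothetical node and topic names:

```python
# A minimal sketch, assuming a ROS 2-style middleware (rclpy); the article does
# not say which framework the CANVAS platforms use, and the node and topic
# names below are hypothetical.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, PointCloud2


class FusionNode(Node):
    """Subscribes to camera and lidar topics and pairs up the latest messages."""

    def __init__(self):
        super().__init__("fusion_node")
        self.latest_image = None
        self.create_subscription(Image, "/camera/front/image", self.on_image, 10)
        self.create_subscription(PointCloud2, "/lidar/points", self.on_points, 10)

    def on_image(self, msg: Image):
        self.latest_image = msg  # cache the most recent camera frame

    def on_points(self, msg: PointCloud2):
        if self.latest_image is not None:
            # A real system would time-align and fuse the two modalities here.
            self.get_logger().info("Fusing lidar sweep with cached camera frame")


def main():
    rclpy.init()
    rclpy.spin(FusionNode())


if __name__ == "__main__":
    main()
```

The point of the sketch is the pattern, not the specifics: each new sensor is just another publisher, and the real work lies in time-aligning and fusing whatever arrives.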

Although students can study robotics and systems engineering in the classroom to appreciate such challenges of autonomy, Kent says, there’s no substitute for hands-on experience when it comes to vehicles.

And he would know. After earning his bachelor’s degree in electrical engineering from the University of Michigan, he spent some time in industry before deciding to return to school and earn his doctorate researching autonomous vehicles. He enrolled at MSU because he saw it as the best fit for his skill set.

A photograph of the Student Organized Autonomy Research Group, or SOAR, and other MSU autonomous vehicle researchers. Associate Professor Daniel Morris is at the far left. Credit: Courtesy of SOAR

“MSU has the program, it has the equipment and it has the expertise,” Kent says.

It’s not just graduate students who benefit, either. There’s the Student Organized Autonomy Research Group, or SOAR, whose membership is open to undergraduates.

SOAR was originally formed with the goal of developing a fleet of autonomous golf carts, but then its members learned about CANVAS and set their sights on a larger vehicle. As a result, that computer system occupying the trunk of the Chevy Bolt was built almost completely by students.

That means MSU isn’t just helping its industrial partners by inventing technical solutions; the university also is preparing the next generation of talent who will work for carmakers and other industries pushing the envelope of automation.

In addition to the students and their faculty mentors who are helping shape the future of autonomy, the campus itself has gotten in on the action.

Here’s a look at MSU’s autonomous Chevrolet Bolt and some of the technology it has on board. Credit: Courtesy of the Student Organized Autonomy Research Group

A 'one-of-a-kind environment'

It was in the past five years that MSU formally made it a priority to mold itself into a testbed for mobility research. Unofficially, the East Lansing campus grew into that role organically over its lifetime.

Even without an emphasis on mobility, MSU’s campus would still have more than 5,000 acres featuring a mix of urban, suburban, rural and industrial zones. It would still have nearly 60 lane miles of roadway, 20 miles of bike paths and more than 100 miles of pedestrian walkways winding through its more than 20,000 trees. Students, staff and faculty would still get around by car, bus, bike, scooter, skateboard and even the occasional rollerblade.

MSU’s autonomous vehicle passes through the same intersection in three seasons: summer (top), winter (bottom left) and spring (bottom right). Copyright Institute of Electrical and Electronics Engineers, 2018 IEEE 88th Vehicular Technology Conference

But when all that happens at the home to world leaders in mobility research, the campus truly becomes a “one-of-a-kind environment to study all areas of mobility advancement,” as stated by the MSU Mobility website.

In the context of autonomous vehicle research, that advancement requires Kent and his colleagues to take the Bolt, the MKZ and their autonomous systems over the same stretches of road over and over and over again. This allows the team to record and analyze how the systems see and process the same road in different conditions at different times of day and at different times of the year.

That’s the hard, thankless work required to build trust in autonomous systems, to prove they can operate safely without a driver. To that point, the Bolt and MKZ rarely operate in their autonomous modes without a driver’s input. The team doesn’t need to let either car drive itself to know what it would do in a given situation. It all comes out in the data.

In a world that appreciates the next and newest, this may come off as restrictive. To many researchers, MSU’s included, it’s a responsibility they know can cost lives if it’s not taken seriously.

MSU Professor Betty H.C. Cheng

“There’s a lot of push for autonomous vehicles. There’s never going to be a lack of motivation for our research efforts,” says Betty H.C. Cheng, a professor of computer science and engineering. “These new and shiny things are amazing, but we need to make sure safety and assurance come with them.”

Cheng has been working on a spectrum of techniques to address assurance, including cybersecurity, for autonomous systems for about two decades. That includes automobiles, but she’s also worked on projects with NASA, the U.S. Navy and the U.S. Air Force Research Lab.

For her, assurance means working to ensure that new software developed to run systems that humans depend on actually makes those systems better, without compromising safety when it encounters unforeseen problems.

When it comes to autonomous vehicles, there are many areas of assurance to probe. For example, how can researchers ensure their systems have enough training in the lab to make the right decisions on the road? And can artificial intelligence help prepare autonomous vehicles for possible scenarios that researchers may not anticipate or be able to test during development?

“There’s a broad range of sources of uncertainty on the road, for example, weather, road conditions, pedestrians, other drivers. So we can’t always achieve perfect behavior, but we need to have acceptable behavior,” Cheng says. “That means always making safety the number one priority.”

Through the work of Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS, autonomous vehicles are using technologies such as lidar to sense and react to their environment.

An information infrastructure

That commitment to safety is also reflected in MSU’s autonomous bus, which was brought to campus in partnership with the companies Karsan and ADASTEC.

When the bus first started carrying passengers earlier this year, it became the first and only one in the U.S. operating at its level of autonomy with approval from the National Highway Traffic Safety Administration.

Sparty poses with MSU’s two autonomous cars and its autonomous bus. Credit: Derrick L. Turner

Unlike MSU’s Bolt and MKZ platforms, the bus is intended to drive itself at all times. But the bus does have a driver who can take control if needed as a precautionary measure. The bus also runs on a limited schedule, only on weekdays and exclusively during the daytime. Every intersection along the bus’s route also has cameras and electronics that communicate with the bus to help ensure safety.

All of this comes with a bonus: the bus is generating one-of-a-kind data while driving on campus.

For instance, it’s letting researchers observe how real people interact with real autonomous vehicles. This type of information is useful in an emerging field being pioneered by MSU called “sociomobility” that examines the societal impacts of the evolving mobility ecosystem.

Of course, the bus is also generating performance data akin to the Bolt and MKZ that is valuable to the entire autonomous vehicle research community. Researchers can, for example, analyze the bus’s interactions with the off-board electronics at intersections to inform the development of smart infrastructure off campus.

Qiuqi Cai (right) and Anshu Bamney (center) were doctoral students who worked with MSU Foundation Professor Peter Savolainen (left), a researcher whose interests include sociomobility. Both Cai and Bamney graduated in 2022. Credit: Courtesy of MSU College of Engineering

Researchers led by MSU’s Nizar Lajnef, an associate professor of civil and environmental engineering, have already deployed prototype sensors that keep tabs on the status of the Mackinac Bridge that connects Michigan’s two peninsulas. Future iterations of sensors like these could communicate information about roadway conditions to connected vehicles.

Researchers are also working on how pedestrians can contribute to a safer autonomous vehicle ecosystem. Arun A. Ross, the Martin J. Vanderploeg Professor of engineering, worked on a project that uses a car’s camera and software to anticipate pedestrian paths. Other researchers are bringing pedestrians into the ecosystem through their phones and related technology.
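
Ross’s approach isn’t described in detail here; the simplest stand-in for path anticipation is constant-velocity extrapolation from a short track of camera detections, sketched below with made-up coordinates.

```python
# A bare-bones stand-in for pedestrian path anticipation: constant-velocity
# extrapolation from two tracked positions. Real systems use far richer models;
# the coordinates below are made up for illustration.

def predict_path(p_prev, p_curr, dt, horizon_s, step_s=0.5):
    """Extrapolate future (x, y) positions from the last two observations.

    p_prev, p_curr: ground-plane positions (meters) observed dt seconds apart.
    Returns predicted positions every step_s seconds out to horizon_s.
    """
    vx = (p_curr[0] - p_prev[0]) / dt
    vy = (p_curr[1] - p_prev[1]) / dt
    steps = int(horizon_s / step_s)
    return [(p_curr[0] + vx * step_s * k, p_curr[1] + vy * step_s * k)
            for k in range(1, steps + 1)]

# A pedestrian seen at (2.0, 5.0) m and then (2.0, 4.5) m half a second later
# is walking toward the lane; predict where they will be over the next 2 seconds.
print(predict_path((2.0, 5.0), (2.0, 4.5), dt=0.5, horizon_s=2.0))
```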

“A team of graduate students in my Advanced Topics in Automated Vehicles course used Bluetooth beacons to create a system that identifies cyclists that are over the horizon to improve passing safety in poor lighting or weather conditions,” says Josh Siegel, an assistant professor in the Department of Computer Science and Engineering at MSU.
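
The course project’s internals aren’t described, but one common way to turn a Bluetooth beacon’s signal into a proximity cue is a log-distance path-loss estimate of range from received signal strength (RSSI). The calibration constants and alert radius below are assumptions for illustration, not values from the students’ system.

```python
# Rough illustration: estimating how far away a beacon-equipped cyclist is from
# a single RSSI reading using a log-distance path-loss model. The constants are
# assumed calibration values, not measurements from the MSU course project.

TX_POWER_DBM_AT_1M = -59   # assumed RSSI one meter from the beacon
PATH_LOSS_EXPONENT = 2.2   # assumed environment factor (2.0 = free space)

def estimated_range_m(rssi_dbm: float) -> float:
    """Approximate beacon distance in meters from one RSSI sample."""
    return 10 ** ((TX_POWER_DBM_AT_1M - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

def cyclist_warning(rssi_dbm: float, alert_radius_m: float = 100.0) -> bool:
    """Flag a warning when the estimated range falls inside the alert radius."""
    return estimated_range_m(rssi_dbm) <= alert_radius_m

# An -85 dBm reading works out to roughly 15 m with these constants, well
# inside the alert radius, so the warning fires.
print(cyclist_warning(-85.0))  # True
```

In practice RSSI is noisy, so a real system would smooth over many samples; the appeal of the beacon approach is that it works before the cyclist is visible to cameras, which is exactly the over-the-horizon case the students targeted.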

Siegel’s lab is also working to teach autonomous vehicles to drive more defensively and, with the help of undergraduate professorial assistant Jacob Rutkowski, asking how autonomous vehicles will solve what’s known as the trolley problem.

As the name suggests, the dilemma was inspired by trolleys and not driverless cars, but the basic premise holds: If an autonomous vehicle comes to a situation in which a crash is inevitable and it has to choose what to hit — say, a commuter train or a school bus — how do ethics inform what it decides to do?

Siegel also has a variety of projects outside of autonomous vehicles, but they’re all connected by a unifying motivation. He wants his projects to be at the cutting edge of technology, but always with a clear focus on how that technology can benefit humanity.

“It’s fun to design in a vacuum,” Siegel says. “But thinking holistically is how we can have the best possible impact.”

Siegel is far from alone in this sentiment, which maybe isn’t surprising given how interdisciplinary MSU’s mobility research is. Michigan State has faculty working together to build the future of transportation with backgrounds in automotive engineering, computer science, public policy, user experience and more.

Each field brings a unique and essential line of inquiry, and those lines are inextricably linked, as reflected in MSU’s autonomous vehicle research. Despite the volume and diversity of problems being solved, however, Satish Udpa, interim director of MSU Mobility, can distill their goals into a simple query.

“If I hail a taxi and an autonomous vehicle shows up, would I get in?” asks Udpa, a University Distinguished Professor of electrical and computer engineering. “Would my grandmother?”

He knows there aren’t a lot of people answering those questions with a “yes” right now. But he’s also confident that trust will grow in future generations thanks to the work MSU is doing today.



By: Matt Davenport, Deon Foster, Meredith Mescher and Anthony Siciliano
