Why Aviation’s Digital Transformation Needs Human-Centered Design – A Conversation with Dr. Guido Carim Jr.

by Boyana Peeva

December 11, 2025

9 min read

Dr. Carim Jr. has lived aviation from every angle. He started as a pilot, became a flight instructor, managed safety operations at a Brazilian airline while flying as a first officer, and now leads aviation research at Griffith University in Australia. He knows what works in theory and what actually works at 35,000 feet.

Throughout his career, Carim Jr. has observed executives and tech vendors enthusiastically promote digital transformation while the people who use these systems – pilots, safety managers, operations teams – quietly struggle with poorly designed implementations. He's watched airlines rush to digitize without understanding how their people actually work, creating expensive solutions that sometimes make jobs harder, not easier.

In our conversation for Dreamix's "State of Aviation" column, he explains how aviation can embrace new technology while designing it to boost human capability rather than hinder or replace it.

Key Takeaways

  • Technology should augment, not replace human capability. The most critical aviation decisions require human adaptation and creativity. AI and automation excel at reducing workload from repetitive tasks, not making life-or-death choices.
  • Digitization without redesign creates new hazards. Converting paper manuals to PDFs solves weight problems but destroys the tactile navigation pilots relied on. True digital transformation means rethinking workflows, not just changing formats.
  • Studying normal operations reveals patterns that accidents alone can't show. One study found that cockpit distractions actually improved safety by keeping pilots more vigilant.
  • Legacy systems resist change for good reason. We can never fully test how new technology behaves in every situation. When the whole world uses the same systems, replacing them creates risks that coordinated global operations can't afford.
  • Adaptability matters as much as compliance. Brazilian pilots excel at problem-solving in ambiguous situations, while Australian pilots benefit from world-class training standards. The future needs both.

Q: You've had a fascinating journey from pilot to safety manager to academic researcher. How did you end up in aviation, and what shaped your focus on human factors?

Carim Jr.: I always wanted to become a pilot – as far back as I can remember, flying with my parents on commercial flights. I completed my bachelor's degree in aviation and received my pilot qualifications, became a flight instructor, then pursued a master's and PhD in industrial engineering focused on safety and production systems.

By the time I was accepted into the PhD program, I'd also been accepted as a first officer at an airline. One of the executives invited me to join the safety department, where I started a group focused on human factors – running programs on human factors training, line operations safety audits, and fatigue risk management systems. I flew there for six years while managing a team that grew to over 20 pilots. After six years, I moved to Australia to join Griffith University. I've been here for eight years now.

Q: When it comes to crisis management, what should airlines and airports prioritize to build resilience? Is technology even the answer?

Carim Jr.: We're working on a book chapter arguing that airports should shift from scenario-based planning to what we call societal resilience. There are many assumptions in planning that end up not being realistic.

I see technology as a way to enhance people's capability and readiness, not replace it. Technology should be used for tasks that don't add value – transcribing what happened, capturing experiences, reviewing documents. The idea that technology can replace people in critical tasks is completely flawed.

The problem with preparing for every single scenario is that what you plan for rarely matches what you actually experience. That mismatch between plan and reality creates a gap that only people's adaptation and skills can bridge – something technology cannot do, or can do only under certain predictable conditions. If we had technology that helps people adapt, that would enhance our ability to respond to crises.

Q: In your research, you've talked about "trapping paper checklists into screens" – digitizing the same old procedures without rethinking them. What's the biggest mistake you see companies making when they go digital?

Carim Jr.: The need to digitize is clear – you don't want to carry 40 kilos of paper in a cockpit. And digitization has solved real problems. But what often happens is that companies convert everything to PDFs with search functions, which seems practical but creates unexpected challenges.

With paper manuals, pilots could use their fingers to mark different pages when dealing with complex problems, then flick between them quickly. With PDFs, you're clicking around between pages, and that tactile navigation is lost. You also can't highlight or keep your own annotated version alongside the company manual.

The search function requires very precise terminology – there are no Google-style suggestions. If you're using slightly different wording than the manufacturer, finding information becomes difficult. You often need to already know where something is located to retrieve it effectively.
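The gap Carim Jr. describes is the difference between literal substring search and suggestion-style fuzzy matching. A minimal sketch of the two behaviors, using invented manual section titles and Python's standard `difflib` (not any real EFB product):

```python
import difflib

# Hypothetical manual index; real QRH/FCOM titles will differ.
sections = [
    "Engine Fire or Severe Damage",
    "Cabin Altitude Warning",
    "Loss of Thrust on Both Engines",
]

def exact_search(query, titles):
    """PDF-style search: only literal substring matches are found."""
    return [t for t in titles if query.lower() in t.lower()]

def fuzzy_search(query, titles, cutoff=0.4):
    """Suggestion-style search: tolerates wording that differs
    from the manufacturer's, ranked by string similarity."""
    return difflib.get_close_matches(query, titles, n=3, cutoff=cutoff)

# A pilot searching with airline phrasing instead of the manufacturer's:
print(exact_search("engine failure", sections))           # [] -- no literal hit
print(fuzzy_search("Engine Failure or Damage", sections))
```

The exact search returns nothing because "engine failure" never appears verbatim, while the fuzzy search still surfaces "Engine Fire or Severe Damage" – the kind of tolerance the AI-assisted retrieval mentioned later aims to provide.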

Basically, every time we implement new technology, we change how people work, which creates new opportunities for mistakes. It takes time for people to adapt. AI will hopefully improve search functions, but it's an ongoing cycle – we solve one set of problems, discover new ones, adapt, then eventually replace the technology and start again.

Q: Where's the biggest gap between what legacy aviation systems can do and what modern operations need?

Carim Jr.: Aviation is very conservative, and there's a reason for that. Every time we have new technology, it creates uncertainties. We'll never have enough time to test it exhaustively, to the point where we know for sure how it behaves in every situation and how people make mistakes with it.

The second factor is scale. Aviation is an international activity in which everyone uses pretty much the same systems. When you have a new system, the whole world may have to adapt, which is a huge challenge.

The most iconic example is NOTAMs. They're extremely outdated – designed in a period when we didn't even have computers – but we're still using the same abbreviations, notation, and layout. There are many calls for change, but what we see is efforts to add AI systems to help pilots filter current NOTAMs rather than replacing the system entirely, which would require coordination among hundreds of countries and impact thousands of organisations worldwide.

It's easier to add another layer between pilots and the legacy system. I think that will create more problems than it solves. If it's not working, knock it down and build from scratch. But people tend to avoid these abrupt changes because of fear and uncertainty.

Q: What excites you most about aviation technology right now?

Carim Jr.: I was talking with a colleague about an AI system they developed – what we call RAG, Retrieval Augmented Generation. It's an AI with a structured database that retrieves specific information rather than pulling from the generic AI knowledge base (web information used to train the model).

One company had a nightmare scenario: different pilot contracts with different duty times, flight hours, days off, payment rules and rates. When you have 10 or 20 different contracts for pilots flying the same aircraft, it becomes impossible to manage manually. One pilot might be allowed to fly longer than another in the same cockpit because of their contract.

They developed an AI system that retrieves contract information and compares it instantly – determining which contract limits the crew for any given flight. It can handle complexity that would be a nightmare for humans to manage.
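Setting the retrieval side of RAG aside, the comparison rule itself is simple: the flight is bound by the most restrictive contract among the crew. A minimal sketch with invented contract fields (real agreements carry many more clauses):

```python
from dataclasses import dataclass

@dataclass
class Contract:
    """Hypothetical subset of a pilot contract's duty limits."""
    name: str
    max_duty_hours: float
    max_flight_hours: float

def crew_limits(contracts):
    """The effective limit for a flight is the minimum of each
    limit across every contract represented in the cockpit."""
    return Contract(
        name="effective",
        max_duty_hours=min(c.max_duty_hours for c in contracts),
        max_flight_hours=min(c.max_flight_hours for c in contracts),
    )

crew = [
    Contract("captain-2019", max_duty_hours=13.0, max_flight_hours=9.0),
    Contract("fo-2022", max_duty_hours=12.0, max_flight_hours=9.5),
]
limits = crew_limits(crew)
print(limits.max_duty_hours, limits.max_flight_hours)  # 12.0 9.0
```

The hard part in practice is not this minimum – it's extracting the right clauses from 10 or 20 differently worded contracts, which is where the retrieval-augmented AI earns its keep.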

The second thing I see happening is AI assistants – like a copilot for information. For example, medical emergencies on board. Currently, we have to contact a doctor through a terrible high-frequency radio, trying to understand what they're saying while passing information to the crew in the back. What should take two minutes becomes hours.

I think we'll soon see medical emergency support systems in the cockpit helping pilots with the basics. Beyond life-or-death decisions, pilots also need to weigh costs. Landing at an alternate airport may cost $200,000. If it's not critical, would you spend that? AI can pull information from different areas and help make those decisions based on symptoms. There's always risk, but it's about making the right decisions more often than wrong ones.

Q: Most airline safety systems are built around what went wrong. Can we learn from the thousands of things that go right every day instead?

Carim Jr.: We have the tools and some processes. There's a great white paper by American Airlines on learning from everyday safety using LOSA – Line Operations Safety Audits – to capture what pilots got right and wrong, not just wrong.

We did something similar at the airline I worked for. We relabeled it "Line Observation Safety [Company Name]" because we wanted pilots observing in the cockpit just to describe what happened, without judgment. The program ran a couple of times a year to monitor normal operations, but it could also be deployed to answer specific questions. In one project, marketing wanted the cockpit door open during boarding so passengers felt reassured, but pilots complained about noise and distraction, particularly from ground personnel (loud radios, constant conversations with the cabin crew, etc.).

We ran a study – one month with the door closed, one month open, with trained observers taking notes on mistakes, problems, and undesirable aircraft states. Surprisingly, the results showed that the closed door was worse. When the door was closed, pilots had a sense of security – nothing was distracting them, so there was no felt need for double-checking. But during the flight, they'd discover mistakes and wrong configurations. With the door open, because they were constantly worried about distraction, they were double-checking and triple-checking everything. The number of mistakes and wrong configurations during the flight dropped significantly.

There's a theory called risk homeostasis: when people perceive risk is increased, they tend to be more careful and reduce their pace. It's like in a car park – you see lots of people walking around, so you slow down, make eye contact, become more vigilant. That increases safety.

We collected data from over 600 flights – 300 door closed, 300 door open. Every indicator showed door open was better. The pilots didn't like the results at all, but the numbers were significant. It was a great example of how we can use safety management systems to collect data that's not just reactive but predictive.
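A comparison like this comes down to testing whether an event rate differs between the two conditions. A minimal sketch of a two-proportion z-test, with invented counts (the article reports 300 flights per condition but not the raw numbers):

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference in event rates
    between two groups of n1 and n2 observations."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal-approximation two-sided p-value via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative only: flights with at least one undesired aircraft state,
# door closed (60/300) vs door open (35/300).
z, p = two_proportion_z(60, 300, 35, 300)
print(round(z, 2), round(p, 4))
```

With counts of this shape the difference is significant at conventional thresholds, matching the study's conclusion that the open-door condition produced measurably fewer undesired states.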

Q: You've worked in both Brazilian and Australian aviation. What does each do really well?

Carim Jr.: Brazilian pilots are praised for being very flexible and adaptable. They come with a mindset of "I need to solve the problem" rather than "I can't do this because the regulation says so." Every now and then, situations go off-plan and you need to adapt. Some Australian pilots struggle with this because they're told to follow the rules all the time. They fail to recognize when a situation requires adaptation, creating conflict between following rules and achieving better outcomes.

However, the training system in Australia is much better. They have higher standards – going way above and beyond minimal training. In Brazil, minimal training is very ingrained in the culture: "What does the regulation say? That's the minimum, let's do the minimum." In Australia, you do what's best regardless of the minimum. We don't even think about the minimum – it's about the highest standard.

This speaks volumes about the quality of Australian aviation regulation. There's been no accident in commercial aviation on record here for a long time. Many countries send their future pilots to be trained in Australia because of these standards. The training system in Australia is really good, but Brazilian pilots have this adaptability as part of their culture, which helps in situations requiring problem-solving.

Q: What are you working on right now that keeps you up at night or gets you up in the morning?

Carim Jr.: There are three areas I'm passionate about, which I’ve been working on relentlessly.

First, pilot training. There's an opportunity to use low-fidelity simulators – desktop simulators – in ab initio pilot training. They're fantastic tools but overlooked because student pilots can't reduce their required aircraft hours by using them. That's a very narrow view. People can become better pilots using these simulators. You get economies of scale, more people practicing together before getting to the aircraft, and you can simulate complex concepts like aerodynamics to aid understanding.

Second, using technology to augment people's capabilities. We're testing an AI-driven system to help pilots retrieve information from manuals, do performance calculations, and help with troubleshooting. The idea of technology as an enabler of better capability.

Third, which is my main research area, is making safety management systems more effective and bespoke to companies. What I see is everybody copying practices from other companies because they're auditable. But it may not suit your organization or context. You have this beautiful system with all the manuals, but in reality, you don't use it.

One area I'm focused on is truly implementing risk management through safety management systems. Companies need multiple risk models for their operations. Then use reporting systems, investigations, and audits to feed the system. The idea is to identify patterns across events to predict whether there's a chance of experiencing an accident with certain traits.

My vision is that every company will have its own safety management system that doesn't look similar to any other company's. Everybody will have their own version. That's my goal.

A B2B marketer and journalist with a passion for tech and aviation. When I'm not diving into the latest industry trends, you'll find me enjoying meaningful conversations and exploring the great outdoors. I have a deep appreciation for the arts and a good glass of wine.