Interview: The bias dilemma in construction AI


Artificial intelligence’s (AI) role in construction is growing, but challenges with bias demand thoughtful solutions. Catrin Jones speaks with Karoliina Torttila, director of AI at Trimble, about some of these solutions.

Artificial intelligence (AI) has transformed industries worldwide, and the construction sector is no exception. From optimising logistics to improving compliance checks, AI is shaping the built environment in ways previously unimaginable.

Karoliina Torttila, director of AI at Trimble (Photo: Trimble)

For instance, AI-powered design tools can automate workflows, speeding up project timelines while reducing errors. Predictive maintenance systems leverage AI to identify potential equipment failures before they occur, improving worker safety and minimising downtime. Additionally, AI is transforming resource management, helping construction companies allocate materials more efficiently and sustainably.

However, with these advancements come challenges, particularly around bias. To explore this, Construction Europe spoke with Karoliina Torttila, director of AI at Trimble. With her extensive experience in AI development, Torttila offers insights into the causes of bias, its implications for the industry, and potential solutions.

Understanding bias in AI

“Bias is an essential concept in AI development,” Torttila begins. “While societal discussions often portray bias negatively, in AI, some level of inductive bias is necessary for machine learning models to function.” She explains that inductive bias allows AI models to make assumptions based on training data, enabling them to focus on plausible scenarios instead of being overwhelmed by infinite possibilities.

However, the problem arises with harmful biases stemming from incomplete or skewed training data. “Take geospatial mapping,” she says. “If training data lacks representation from remote areas or certain continents, AI models may produce lower-quality maps, leading to inadequate infrastructure planning or misallocated resources.”

She also points to compliance algorithms as another example. If trained on data from only one jurisdiction, these models could incorrectly flag or overlook issues due to regional regulatory discrepancies, she warns.

Bias in language models

Torttila delves into the nuances of bias in large language models (LLMs), which are increasingly used across industries. “Bias can enter at multiple stages of development, from the pre-training phase, where models learn language patterns, to the fine-tuning phase, where human input introduces subjective preferences.”

For instance, the human task of ranking model outputs for quality inherently introduces bias. “Even strict guidelines for annotators are based on someone’s interpretation,” she notes. Torttila also highlights challenges in safety alignment, where efforts to prevent harmful outputs can sometimes overcorrect. There have been examples of image generation models producing unexpected outputs due to these corrections.

The industry context

In construction, the implications of bias can be significant. Trimble’s AI applications often involve data about built environment assets, such as buildings and bridges, rather than personal information. This distinction doesn’t eliminate bias but changes its nature. “Inadequate data on newer materials or underrepresented regions can result in errors that affect entire projects,” Torttila explains.

She emphasises that the industry must approach bias with a constructive mindset. “Hunting for bias and punishing failures doesn’t foster improvement,” she asserts. Instead, fostering collaboration between stakeholders can drive progress.

“One positive development in the built environment, both in Europe and globally, is the growing number of consortiums bringing different parties together. This reflects a shared understanding that the challenges we face are too large for any single entity to tackle alone.”

When asked about solutions, Torttila is clear: data quality and transparency are critical. “Including diverse, high-quality datasets is a big part of mitigating bias,” she says. She also underscores the importance of building systems around AI models that include safeguards like human oversight or secondary AI checks.

Regulation is another area she believes will play a key role. “The European AI Act is an important step, but it’s crucial to strike the right balance,” she advises.

Overregulation could stifle innovation, limiting AI development to only the largest, best-funded companies. “Only companies like Microsoft, Google, OpenAI, and a few others developing this technology will be capable of managing it. Smaller startups, nonprofits, or even universities might not then be able to because the regulatory burden is so heavy.

“I think there’s a risk of hindering progress if regulation isn’t approached thoughtfully. That’s why legislating and regulating at the right level – whatever that level needs to be – must be the starting point,” she says.

The human-machine dynamic

Torttila highlights the importance of the interplay between humans and machines in mitigating bias. AI doesn’t operate in isolation, and most industries, including construction, are not yet at a point where they are comfortable with fully autonomous systems making decisions. Human oversight remains essential, whether as a final checkpoint or as part of a broader feedback loop.

She also addresses accountability, a contentious issue in AI ethics. “Responsibility is shared between developers, users, and society at large,” she argues. Transparency and setting realistic expectations are vital. “Perfection is unattainable, but we’re going to start from somewhere, provide value and then hopefully we are collectively going to make things better.”

Torttila’s insights reveal both the challenges and opportunities of addressing bias in AI within the construction industry. As AI becomes increasingly integrated into the built environment, thoughtful approaches to data, regulation, and human-machine interaction will be critical. While the road ahead is complex, fostering collaboration and maintaining a constructive mindset can help ensure AI continues to deliver transformative benefits to society and to construction.
