Drilling automation

Societies Tackle Human Challenges of Automation

A joint webinar conducted by the Human Factors and Ergonomics Society and the Society of Petroleum Engineers addressed the role of human factors in automation in the oil and gas industry.

drillingcabin.jpeg
Source: Nabors

Following a recent signed letter of cooperation between the two groups, the Human Factors and Ergonomics Society (HFES) and the Society of Petroleum Engineers (SPE) came together to address the human side of automation. While automation is spreading in the oil and gas industry, humans are not out of the loop—not by a long shot. And they won’t be any time soon, or ever.

But humans are tricky, and the human part of automation creates some challenges.

Marcin Nazaruk, former chairperson of SPE’s Human Factors Technical Section; Shashi Talya, Halliburton’s global production manager for drilling automation; and Mica Endsley, president of SA Technologies and former chief scientist for the US Air Force, laid out the challenges in a webinar with moderators Camille Peres, an associate professor at Texas A&M University, and Julie Gilpin-McMinn, a technical fellow in human factors/ergonomics at Spirit Aerosystems.

Looking primarily at drilling—although the challenges can apply to all automation—Talya presented three key challenges he sees when implementing automation in drilling.

The first is the connection between the human and the automation on site. Driller’s cabins can be “pretty crowded real estate,” Talya said. There’s not a lot of room to add more sources of information. “You can’t keep adding more and more information in that limited space. There needs to be some way to consolidate and concisely show the information that the driller is interested in or wants to know in terms of what’s happening on the rig floor.”

Remote operation appears to be an attractive solution, but that creates a challenge of its own.

“One of the drivers for drilling automation is to enable us to do remote execution,” Talya said, “to get to a point where, eventually, we would have people in remote centers that are able to autonomously, automatically access or view what’s happening on the rig floor.”

That’s all fine and good, but “imagine having a similar setup in a remote center but now you’re trying to do the same work across multiple rigs,” Talya said. “So your challenge suddenly multiplies. If you maintain the same status quo in terms of type of information that’s being displayed, the type of interaction a human has with the automation system, then you’re suddenly overloading the human or the operator with a lot of information that they now have on top of what they’ve already been doing before.”

Yes, the driller is still drilling, even with automation.

“The human is still responsible for some of the manual tasks in addition to interfacing with the automation system and trying to figure out when he or she needs to step in when the automation system asks for help,” Talya said. “So, in a sense, you’re kind of in a hybrid mode where the human is not only doing their manual tasks that they’ve always been doing but they’re also looking at ‘how do I interface with this automation system and be able to recognize what’s going on and take action as needed?’ ”

Therein lies Talya’s third challenge: training. “In addition to the whole drilling process, you now need training and competency in the automation system itself to the extent that you have a pretty reasonable or good understanding of what the automation system is doing in the background while you’re trying to monitor the overall process that’s happening,” he said.

“Systems are never fully autonomous,” added Endsley. “People need to be able to interact with them and oversee their performance. Just like no man is an island, no autonomy is an island.”

Perhaps technology can solve this problem, too. “I know there’s a big trend now that, ‘Oh, AI [artificial intelligence] is going to solve all these problems,’ ” Endsley said. “The reality is it doesn’t. Even with AI, it still only does some things to a certain degree. We still need people to be part of that loop.”

The consistent need for humans to be engaged with the automation means that, counterintuitively, increasing automation increases the human factors challenges. This is what Endsley calls the automation conundrum: “The more automation is added to a system, and the more reliable and robust that automation, the less likely that human operators overseeing the automation will be aware of the critical information and able to take over manual control when needed.”

Increasing the reliability of automated systems helps, Endsley said, but that affects how operators allocate their attention and engagement. A more reliable system means operators put their attention on other aspects of the job and can miss critical information. This challenge, Endsley said, “is not something we can just engineer our way out of. We can’t just piecemeal it and snap in pieces of automation.”

The automation systems have built-in challenges, too. Having the right data at the right time is critical. “No matter how good your software is, it’ll go wrong if the data is not being correctly picked up by the sensors,” she said.

Endsley suggests a holistic approach to the problems, which she has presented in her 2017 paper “From Here to Autonomy: Lessons Learned From Human/Automation Research.”

“What we really have to do is look at the integrated whole and look at the job they’re doing where automation is a part of that with all of the information, all the systems that they’re managing, and say, ‘How do we approach this holistically so that the entire system makes sense?’

“We can do a lot to improve human performance with automation if we make it much more transparent about not only what it’s doing now but what it’s capable of doing and what it’s going to be able to do in the near future.”