
Fears over Boeing's plan to create AI-controlled killer jets for US military - despite slew of scandals


Aerospace giant Boeing — which has already stranded two NASA astronauts in space this year — now plans to build lethal, AI-piloted fighter planes.

The firm's proposed fleet of 'un-crewed' killer aircraft, piloted by 'artificial intelligence' and dubbed MQ-28 Ghost Bats, would number in the thousands for the US alone.

Critics tell DailyMail.com the firm's plans raise concerns for public safety, national security and simply 'good use of taxpayer funds.'

'Boeing's track record doesn't seem to indicate that it's necessarily the best one to implement this kind of thing,' one former State Department official, Steven Feldstein, told DailyMail.com.

Boeing's MQ-28 Ghost Bat is an unmanned drone piloted by 'artificial intelligence' (AI). It is one of several robotic fighter jets competing to become the Pentagon's killer AI drone fleet

Above, an early production photograph documenting the systems installation and integration testing of the AI-powered killer drone's landing gear

With roughly 53 cubic feet of storage capacity in its nose for interchangeable payloads, Boeing's Ghost Bats could one day carry a variety of bombs and munitions, including multiple tactical nuclear weapons.

Currently, three prototypes of the Ghost Bat have been built and flight-tested in Australia for the Royal Australian Air Force (RAAF), with at least one of those delivered to the United States for its own tests and integration trials.

But Boeing's Ghost Bat 'proof of concept' has already managed to prove itself to the RAAF, which has paid over $500 million (USD) for another three MQ-28s and is footing the bill for the domestic manufacturing infrastructure to make more.

The RAAF's next goal is to have 10 Ghost Bats in total by 2025 for active military operations, and it has shouldered R&D costs for the privilege of one day being the first to arm the troubled aerospace firm's killer drone with 'strike capability.'

The Ghost Bat's bomb bay could comfortably, if hypothetically, hold multiple W80-style warheads, each seven times more powerful than the 21-kiloton atomic bomb detonated over Nagasaki.

The current working prototypes of the Ghost Bat in both Australia and the US are 38 feet long, can fly more than 2,300 miles and are already capable of using 'artificial intelligence to fly independently,' according to a Boeing fact sheet.

All told, the United States Air Force's more ambitious and costly plans call for spending nearly $9 billion standing up and maintaining these futuristic AI-piloted fighter jets, according to The Potomac Officers Club.

America's operational AI killing machines are expected to cost $30 million per drone, but the Pentagon is still entertaining bids from Boeing's competitors for the final version, on a longer timeline extending into 2029 and beyond.

So, while the US Air Force has requested a comparable $557 million this year to develop and test its AI fighter program as part of its FY2025 budget, the ultimate prize will be multibillion-dollar deals to manufacture this 1,000-jet-strong US AI fleet.

But for now, the Ghost Bat's Australian paymasters say they are focused less on its lethality than on its utility for airborne spycraft.

'We are very open for Ghost Bat to have strike capability,' Australia's Minister for Defense Industry and Capability Delivery, Patrick Conroy, said in August. 'We just want the intelligence, surveillance, and reconnaissance [ISR] first.'

'We are looking for immediate bang for buck, regarding ISR on the Ghost Bat.'

A prototype of Boeing's Ghost Bat (above) has already managed to prove itself to the Royal Australian Air Force - which has paid over $531 million (USD) for the privilege of one day arming the troubled aerospace firm's killer AI drone fleet with 'strike capability'

Above, an image of the first completed Ghost Bat fuselage for Australia's 'Loyal Wingman' program, as released to the public by Boeing Australia in 2020  

Feldstein — the former State Dept. official, now a senior fellow in the Carnegie Endowment's Democracy, Conflict and Governance Program — told DailyMail.com he questions whether the scandal-plagued company is right for the job.

'Just looking at its track record when it comes to emerging, cutting-edge technologies, most recently its troubles when it comes to spaceflight, but with other products as well,' Feldstein said, 'it doesn't have a really strong record when it comes to doing innovative things.'

Over the past several years, multiple models of Boeing's commercial passenger jets have, in fact, faced door plug blowouts, mid-air engine fires, and two deadly crashes that killed 346 people.

Feldstein, who has written on the moral threat of AI warfare for the Bulletin of the Atomic Scientists, noted Boeing's history in defense raises similar worries.

'Oftentimes, its products seem to have cost overruns and seem to be only kind of marginally effective when it comes to actually accomplishing goals,' he said.

Mary Wareham — advocacy director for the arms division of nonprofit Human Rights Watch — questioned the entire premise of the contracts that Boeing is competing for in the first place, producing what the Pentagon calls 'lethally autonomous weapons.' 

'You're stepping over a moral line by outsourcing killing to machines,' she told the New York Times, 'allowing computer sensors rather than humans to take human life.'

America's operational AI killing machines - reminiscent of the rogue Skynet from the 'Terminator' film series - are expected to cost $30 million per drone

DailyMail.com reached out to Boeing for comment on the wider ethical dimensions of AI-piloted fighter jets, and its MQ-28 in particular, but has not heard back.

In the early years of the US Air Force's Collaborative Combat Aircraft (CCA) program, Boeing enjoyed a clear lead on its rivals to one day score the lucrative award to build America's killer AI drone fleet.

And unlike its competitors — Lockheed Martin, Northrop Grumman, General Atomics and AI start-up Anduril — Boeing actually has working models of its fighter in the air.

And yet, likely in a sign of Boeing's growing reputation for mismanagement, greed and fatal flight safety issues, its Ghost Bat lost out on this year's round of the Air Force's CCA funding to proposals by Anduril and drone-maker General Atomics.

Robert Gonzalez, who tracked the movements of retiring Pentagon officials to high-tech venture capital (VC) funds for the Watson Institute for International and Public Affairs, sees troubling incentives to generate 'hype' about the promise of AI in war.

'I'm not the first one to make this argument, but a lot of this hype is meant to inflate stock prices or to inflate the value of startups,' Gonzalez told DailyMail.com.

'So, there's a real financial interest in generating that kind of euphoria or optimism about AI's potential to solve all kinds of problems.'

'We are very open for Ghost Bat to have strike capability,' Australia's Minister for Defense Industry and Capability Delivery, Patrick Conroy, said in August. Above, one of Australia's Ghost Bat prototypes on a test flight

'Legacy firms, for example, Lockheed Martin and Boeing,' he said, 'have their own venture capital arms, so they have the branches of their corporations that are basically dedicated to providing funding for startups.'

'Boeing's venture capital arm is HorizonX, and HorizonX basically has a portfolio of companies that include a whole bunch of AI startups.'

The case of the Pentagon's $2 trillion boondoggle to turn Lockheed Martin's F-35 stealth fighter into a viable fighting force, for example, shows how one giant defense contractor can weather the reputational harm of failing to deliver the goods.

A series of smaller, more fly-by-night and seemingly independent VC-funded start-ups in this ecosystem may prove equally adept at overpromising and underdelivering, he noted.

One reason why, Gonzalez said, is the 'halo effect,' in which US defense or intelligence community-funded start-ups can use that prestige to raise more money from the private sector, as happened with the CIA's VC-funding arm, In-Q-Tel.

'According to In-Q-Tel, on average every dollar it invests in a company is leveraged into $28 of private VC funding,' according to Gonzalez. 

But, for his part, Feldstein noted that there are already troubling signs of AI's use in warfare in the current conflict in Gaza, where Israel's AI system Lavender has been implicated in the killing of dozens of innocent civilians for each alleged Hamas enemy combatant.

Above, a Boeing MQ-28 Ghost Bat prototype - which is already in use in Australia

'In my eyes, it's sort of like putting a self-driving car on the road without having a clear sense of its failure rates, and hoping it works out fine,' Feldstein told DailyMail.com, 'but not having any verifiable data to go on when it comes to knowing if this machine is safe for bystanders.'

'Same thing with this AI targeting. You know, it is being used in a real life conflict, and yet, our understanding of how accurate the system is, our understanding of the safeguards in place, is very limited.' 

'And, anecdotally, what we've heard are some concerning reports that individuals are being targeted or killed who have little to do with the war itself,' he said.

Feldstein, however, does not think that a future of war without AI is likely, given the incentives for rival nation-state powers. 

But it's his hope that international norms on the technology's use can be maintained, drawing on examples from the Cold War. 

'The way forward is to keep developing these systems, keep testing them, validate their effectiveness, build in place the right safeguards, and hopefully get other countries to agree to minimum floors when it comes to conduct,' he explained.

'We've seen this play out when it comes to nuclear weapons or chemical weapons.' 

'Countries have agreed — even though they are rivals — that it is in their interest to establish basic norms of conduct, rather than just kind of allow a 'Wild West' to occur and potentially result in catastrophic harm,' as he put it to DailyMail.com.

'I hope a similar behavioral pattern plays out with regard to AI weapons,' he added, 'even if we find ourselves competing very vigorously with China or Russia or Iran.'
