movies

why-darren-aronofsky-thought-an-ai-generated-historical-docudrama-was-a-good-idea

Why Darren Aronofsky thought an AI-generated historical docudrama was a good idea


We hold these truths to be self-evident

Production source says it takes “weeks” to produce just minutes of usable video.

Artist’s conception of critics reacting to the first episodes of “On This Day… 1776” Credit: Primordial Soup

Last week, filmmaker Darren Aronofsky’s AI studio Primordial Soup and Time magazine released the first two episodes of On This Day… 1776. The yearlong series of short-form videos features vignettes describing what happened on each day of the American Revolution 250 years ago, but it does so using “a variety of AI tools” to produce photorealistic scenes containing avatars of historical figures like George Washington, Thomas Paine, and Benjamin Franklin.

In announcing the series, Time Studios President Ben Bitonti said the project provides “a glimpse at what thoughtful, creative, artist-led use of AI can look like—not replacing craft but expanding what’s possible and allowing storytellers to go places they simply couldn’t before.”

The trailer for “On This Day… 1776.”

Outside critics were decidedly less excited about the effort. The AV Club took the introductory episodes to task for “repetitive camera movements [and] waxen characters” that make for “an ugly look at American history.” CNET said that this “AI slop is ruining American history,” calling the videos a “hellish broth of machine-driven AI slop and bad human choices.” The Guardian lamented that the “once-lauded director of Black Swan and The Wrestler has drowned himself in AI slop,” calling the series “embarrassing,” “terrible,” and “ugly as sin.” I could go on.

But this kind of initial reaction apparently hasn’t deterred Primordial Soup from its still-evolving efforts. A source close to the production, who requested anonymity to speak frankly about details of the series’ creation, told Ars that the quality of new episodes would improve as the team’s AI tools are refined throughout the year and as the team learns to better use them.

“We’re going into this fully assuming that we have a lot to learn, that this process is gonna evolve, the tools we’re using are gonna evolve,” the source said. “We’re gonna make mistakes. We’re gonna learn a lot… we’re going to get better at it, [and] the technology will change. We’ll see how audiences are reacting to certain things, what works, what doesn’t work. It’s a huge experiment, really.”

Not all AI

It’s important to note that On This Day… 1776 is not fully crafted by AI. The script, for instance, was written by a team of writers overseen by Aronofsky’s longtime writing partners Ari Handel and Lucas Sussman, as noted by The Hollywood Reporter. That makes criticisms like the Guardian’s complaint about “ChatGPT-sounding sloganeering” in the first episodes seem both somewhat misplaced and hilariously harsh.

Our production source says the project was always conceived as a human-written effort and that the team behind it had long been planning and researching how to tell this kind of story. “I don’t think [they] even needed that kind of help or wanted that kind of [AI-powered writing] help,” they said. “We’ve all experimented with [AI-powered] writing and the chatbots out there, and you know what kind of quality you get out of that.”

What you see here is not a real human actor, but his lines were written and voiced by humans. Credit: Primordial Soup

The producers also go out of their way to note that all the dialogue in the series is recorded directly by Screen Actors Guild voice actors, not by AI facsimiles. While recently negotiated union rules might have something to do with that, our production source also said the AI-generated voices the team used for temp tracks were noticeably artificial and not ready for a professional production.

Humans are also directly responsible for the music, editing, sound mixing, visual effects, and color correction for the project, according to our source. The only place the “AI-powered tools” come into play is in the video itself, which is crafted with what the announcement calls a “combination of traditional filmmaking tools and emerging AI capabilities.”

In practice, our source says, that means humans create storyboards, find visual references for locations and characters, and set up how they want shots to look. That information, along with the script, gets fed into an AI video generator that creates individual shots one at a time, to be stitched together and cleaned up by humans in traditional post-production.
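
The article doesn’t describe the software side of that loop, but the workflow it outlines (storyboards, references, and script go in; one candidate shot comes out; humans either approve it or send it back for another generation pass) can be summarized in a rough sketch. Everything below is a hypothetical illustration; none of the names or numbers come from Primordial Soup’s actual pipeline:

```python
import random
from dataclasses import dataclass


@dataclass
class ShotSpec:
    """Everything humans prepare before any AI generation happens."""
    storyboard: str        # e.g., "low angle, flag raising at dawn"
    references: list[str]  # visual references for locations and characters
    script_excerpt: str    # the human-written dialogue for this shot


def generate_shot(spec: ShotSpec, notes: str) -> str:
    """Stand-in for a call to a video-generation model; returns a clip ID."""
    return f"clip_{random.randint(1000, 9999)}"


def approved_by_humans(clip_id: str) -> bool:
    """Stand-in for human review; assume roughly 1 in 10 takes is usable."""
    return random.random() < 0.1


def produce_shot(spec: ShotSpec, max_takes: int = 50) -> str:
    """Regenerate one shot until reviewers sign off or the take budget runs out."""
    notes = ""
    for take in range(1, max_takes + 1):
        clip_id = generate_shot(spec, notes)
        if approved_by_humans(clip_id):
            print(f"Approved on take {take}; hand off to traditional post-production.")
            return clip_id
        notes += " adjust lighting/framing"  # humans revise instructions between takes
    raise RuntimeError("No usable take within budget; rework the storyboard.")
```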

That process takes the AI-generated cinema conversation one step beyond Ancestra, a short film Primordial Soup released last summer in association with Google DeepMind (which is not involved with the new project). There, AI tools were used to augment “live-action scenes with sequences generated by Veo.”

“Weeks” of prompting and re-prompting

In theory, having an AI model generate a scene in minutes might save a lot of time compared to traditional filmmaking—scouting locations, hiring actors, setting up cameras and sets, and the like. But our production source said the highly iterative process of generating and perfecting shots for On This Day… 1776 still takes “weeks” for each minutes-long video and that “more often than not, we’re pushing deadlines.”

The first episode of On This Day… 1776 features a dramatic flag raising.

Even though the AI model is essentially animating photorealistic avatars, the source said the process is “more like live action filmmaking” because of the lack of fine-grained control over what the video model will generate. “You don’t know if you’re gonna get what you want on the first take or the 12th take or the 40th take,” the source said.

While some shots take less time to get right than others, our source said the AI model rarely produces a perfect, screen-ready shot on the first try. And while some small issues in an AI-generated shot can be papered over in post-production with visual effects or careful editing, most of the time, the team has to go back and tell the model to generate a completely new video with small changes.

“It still takes a lot of work, and it’s not necessarily because it’s wrong, per se, so much as trying to get the right control because you [might] want the light to land on the face in the right way to try to tell the story,” the source said. “We’re still, we’re still striving for the same amount of control that we always have [with live-action production] to really maximize the story and the emotion.”

Quick shots and smaller budgets

Though video models have advanced since the days of the nightmarish clip of Will Smith eating spaghetti, hallucinations and nonsensical images are “still a problem” in producing On This Day… 1776, according to our source. That’s one of the reasons the company decided to use a series of short-form videos rather than a full-length movie telling the same essential story.

“It’s one thing to stay consistent within three minutes. It’s a lot harder and it takes a lot more work to stay consistent within two hours,” the source said. “I don’t know what the upper limit is now [but] the longer you get, the more things start to fall off.”

We’ve come a long way from the circa-2023 videos of Will Smith eating spaghetti. Credit: chaindrop / Reddit

Keeping individual shots short also allows for more control and fewer “reshoots” for an AI-animated production like this. “When you think about it, if you’re trying to create a 20-second clip, you have all these things that are happening, and if one of those things goes wrong in 20 seconds, you have to start over,” our source said. “And the chance of something going wrong in 20 seconds is pretty high. The chance of something going wrong in eight seconds is a lot lower.”
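
That intuition is easy to quantify with a toy model. Assuming (purely for illustration; these numbers aren’t from the production) that every second of generated footage carries an independent 5 percent chance of a visible glitch, the odds of a clip coming out clean fall off quickly as it gets longer:

```python
def p_clean_clip(seconds: float, glitch_rate_per_second: float = 0.05) -> float:
    """Probability that a clip has no visible glitches, assuming each second
    independently glitches at the given rate (an illustrative toy model)."""
    return (1 - glitch_rate_per_second) ** seconds


# With an assumed 5 percent glitch chance per second:
print(f"8-second clip comes out clean:  {p_clean_clip(8):.0%}")   # ~66%
print(f"20-second clip comes out clean: {p_clean_clip(20):.0%}")  # ~36%
```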

While our production source couldn’t give specifics on how much the team was spending to generate so much AI-modeled video, they did suggest that the process was still a good deal cheaper than filming a historical docudrama like this on location.

“I mean, we could never achieve what we’re doing here for this amount of money, which I think is pretty clear when you watch this,” they said. In future episodes, the source promised, “you’ll see where there’s things that cameras just can’t even do” as a way to “make the most of that medium.”

“Let’s see what we can do”

If you’ve been paying attention to how fast things have been moving with AI-generated video, you might think that AI models will soon be able to produce Hollywood-quality cinema with nothing but a simple prompt. But our source said that working on On This Day… 1776 highlights just how important it is for humans to still be in the loop on something like this.

“Personally, I don’t think we’re ever gonna get there [replacing human editors],” the source said. “We actually desperately need an editor. We need another set of eyes who can look at the cut and say, ‘If we get out of this shot a little early, then we can create a little bit of urgency. If we linger on this thing a little longer…’ You still really need that.”

AI Ben Franklin and AI Thomas Paine toast to the war propaganda effort. Credit: Primordial Soup

That could be good news for human editors. But On This Day… 1776 also suggests a world where on-screen (or even motion-captured) human actors are fully replaced by AI-generated avatars. When I asked our source why the producers felt that AI was ready to take over that specifically human part of the film equation, though, the response surprised me.

“I don’t know that we do know that, honestly,” they said. “I think we know that the technology is there to try. And I think as storytellers we’re really interested in using… all the different tools that we can to try to get our story across and to try to make audiences feel something.”

“It’s not often that we have huge new tools like this,” the source continued. “I mean, it’s never happened in my lifetime. But when you do [get these new tools], you want to start playing with them… We have to try things in order to know if it works, if it doesn’t work.”

“So, you know, we have the tools now. Let’s see what we can do.”

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.

Why Darren Aronofsky thought an AI-generated historical docudrama was a good idea Read More »

tcl-tvs-will-use-films-made-with-generative-ai-to-push-targeted-ads

TCL TVs will use films made with generative AI to push targeted ads

Advertising has become a focal point of TV software. Companies that sell TV sets are increasingly interested in leveraging TV operating systems (OSes) for ads and tracking. This has led to bold new strategies, like an adtech firm launching a TV OS and ads on TV screensavers.

With new short films set to debut on its free streaming service tomorrow, TV-maker TCL is pitching a new approach to monetizing TV owners and to film and TV production, one that cuts costs by relying on generative AI and targeted ads.

TCL’s five short films are part of a company initiative to get people more accustomed to movies and TV shows made with generative AI. The movies will “be promoted and featured prominently on” TCL’s free ad-supported streaming television (FAST) service, TCLtv+, TCL announced in November. TCLtv+ has hundreds of FAST channels and comes on TCL-brand TVs using various OSes, including Google TV and Roku OS.

Some of the movies have real actors. You may even recognize some (like Kellita Smith, who played Bernie Mac’s wife, Wanda, on The Bernie Mac Show). Others feature characters made through generative AI. All the films use generative AI for special effects and/or animations and took 12 weeks to make, 404 Media, which attended a screening of the movies, reported today. AI tools used include ComfyUI, Nuke, and Runway, 404 reported. However, all of the TCL short movies were written, directed, and scored by real humans (again, including by people you may be familiar with). At the screening, Chris Regina, TCL’s chief content officer for North America, told attendees that “over 50 animators, editors, effects artists, professional researchers, [and] scientists” worked on the movies.

I’ve shared the movies below so you can judge for yourself, but as a spoiler, you can imagine the quality of short films made to promote a service built for targeted ads, using generative AI for fast, affordable content creation. AI-generated video is expected to improve, but it remains to be seen whether a TV brand like TCL will commit to finding the best and most natural ways to use generative AI for video production. For now, TCL’s movies demonstrate the limits of AI-generated video, such as odd background imagery and heavy use of narration that can distract from badly synced audio.

TCL TVs will use films made with generative AI to push targeted ads Read More »

ai-generated-shows-could-replace-lost-dvd-revenue,-ben-affleck-says

AI-generated shows could replace lost DVD revenue, Ben Affleck says

Last week, actor and director Ben Affleck shared his views on AI’s role in filmmaking during the 2024 CNBC Delivering Alpha investor summit, arguing that AI models will transform visual effects but won’t replace creative filmmaking anytime soon. A video clip of Affleck’s opinion began circulating widely on social media not long after.

“Didn’t expect Ben Affleck to have the most articulate and realistic explanation where video models and Hollywood is going,” wrote one X user.

In the clip, Affleck spoke of current AI models’ abilities as imitators and conceptual translators—mimics that are typically better at translating one style into another instead of originating deeply creative material.

“AI can write excellent imitative verse, but it cannot write Shakespeare,” Affleck told CNBC’s David Faber. “The function of having two, three, or four actors in a room and the taste to discern and construct that entirely eludes AI’s capability.”

Affleck sees AI models as “craftsmen” rather than artists (although some might find the term “craftsman” in his analogy somewhat imprecise). He explained that while AI can learn through imitation—like a craftsman studying furniture-making techniques—it lacks the creative judgment that defines artistry. “Craftsman is knowing how to work. Art is knowing when to stop,” he said.

“It’s not going to replace human beings making films,” Affleck stated. Instead, he sees AI taking over “the more laborious, less creative and more costly aspects of filmmaking,” which could lower barriers to entry and make it easier for emerging filmmakers to create movies like Good Will Hunting.

Films will become dramatically cheaper to make

While it may seem on the surface that Affleck was attacking the tech industry’s generative AI capabilities, he did not deny the impact the technology may have on filmmaking. For example, he predicted that AI would reduce costs and speed up production schedules, potentially allowing shows like HBO’s House of the Dragon to release two seasons in the time it currently takes to make one.

AI-generated shows could replace lost DVD revenue, Ben Affleck says Read More »

apple-is-turning-the-oregon-trail-into-a-movie

Apple is turning The Oregon Trail into a movie

Apple will adapt the classic educational game The Oregon Trail into a big-budget movie, according to The Hollywood Reporter (THR).

The film is in early development, having just been pitched to Apple and approved. Will Speck and Josh Gordon (Blades of Glory, Office Christmas Party) will direct and produce. Given that pedigree (zany comedies), it’s clear this film won’t be a serious historical drama about the struggles of those who traveled the American West.

In fact, the report not only notes that it will be a comedy—it says it will be a musical, too. “The movie will feature a couple of original musical numbers in the vein of Barbie,” according to THR’s sources. EGOT winners Benj Pasek and Justin Paul will be responsible for the original music in the film.

Of course, with a comedy, the writers are at least as important as the director. The film will be written by Kenneth and Keith Lucas, though they’re best known most recently for the 2021 drama Judas and the Black Messiah, for which they received an Oscar nomination.

That’s all we know about the film so far. As for the game, well, it needs no introduction—especially for folks who were of the appropriate age to play it at school or at home on personal computers from the 1970s through the 1990s.

The game is a major cultural touchstone for a certain generation—to the point that “The Oregon Trail Generation” has been used as a label for many of the people born in the early 1980s. It’s long been a thing to joke about the game’s morbid content, like the infamous phrase: “You have died of dysentery.”

Since the film was greenlit by Apple, it’s likely to debut on the Apple TV+ streaming service, but we don’t yet know when it will arrive or who will star in it.

Apple is turning The Oregon Trail into a movie Read More »

report:-apple-changes-film-strategy,-will-rarely-do-wide-theatrical-releases

Report: Apple changes film strategy, will rarely do wide theatrical releases

Small screen focus —

Apple TV+ has made more waves with TV shows than movies so far.

A still from Wolfs, an Apple-produced film starring George Clooney and Brad Pitt. Credit: Apple

For the past few years, Apple has been making big-budget movies meant to compete with the best traditional Hollywood studios have to offer, and it has been releasing them in theaters to drive ticket sales and awards buzz.

Much of that is about to change, according to a report from Bloomberg. The article claims that Apple is “rethinking its movie strategy” after several box office misfires, like Argylle and Napoleon.

It has already canceled the wide theatrical release of one of its tentpole movies, the George Clooney and Brad Pitt-led Wolfs. Most other upcoming big-budget movies from Apple will be released in just a few theaters, suggesting the plan is simply to ensure continued awards eligibility, not to put butts in seats.

Further, Apple plans to move away from super-budget films and to focus its portfolio on a dozen films a year at lower budgets. Just one major big-budget film is planned to get a wide theatrical release: F1. How that one performs could inform future changes to Apple’s strategy.

The report notes that Apple is not the only streamer changing its strategy. Netflix is reducing costs and bringing more movie production in-house, while Amazon is trying (so far unsuccessfully) to produce a higher volume of movies annually, but with a mixture of online-only and in-theater releases. It also points out that movie theater chains are feeling ever more financial pressure, as overall ticket sales haven’t matched their pre-pandemic levels despite occasional hits like Inside Out 2 and Deadpool & Wolverine.

Cinemas have been counting on streamers like Netflix and Apple to crank out films, but those hopes may be dashed if the media companies continue to pull back. For the most part, tech companies like Apple and Amazon have had better luck gaining buzz with television series than with feature films.

Report: Apple changes film strategy, will rarely do wide theatrical releases Read More »

report:-apple-tv+-will-soon-get-a-lot-more-movies-made-by-studios-other-than-apple

Report: Apple TV+ will soon get a lot more movies made by studios other than Apple

Streaming services —

Apple TV+ series have made an impact, but its films have been less successful lately.

Apple seeks to continue to augment its library of original films like Argylle with films from other studios.

Apple TV+ has carved a niche for itself with strong original programming, and while it’s still far behind the likes of Netflix in terms of subscribers, it has seen a fairly strong initial run. To build on that, Apple is talking with major studios about ways to complement its slate of original programming with films from other companies in order to expand and extend the service’s appeal.

That’s according to Bloomberg reporters Lucas Shaw and Thomas Buckley, who cite people familiar with Apple’s workings. Those sources say Apple is “having discussions” with more than one large film studio about bringing more movies to the service.

Apple previously experimented with this by licensing around 50 movies and making them available on the service for limited runs over the past several months. That experiment seems to have gone well, leading Apple to begin laying the groundwork for expanding on that.

That test run was just in the United States. Bloomberg claims the focus this time is international, with the possibility of new films not just in the US but in other regions, too.

Hollywood studios have reportedly been anticipating this move. As you may have noticed amid the numerous subscription service price hikes, media companies have begun putting greater emphasis on profitability after the conclusion of a long period where subscriber growth at any cost was the goal. Licensing deals like this can help with that new goal.

It’s worth noting that while Apple has found some big successes in terms of series (Ted Lasso, Severance, The Morning Show) it has struggled to make as much of an impact with its movies. Despite big stars and budgets, the films have not always made as much cultural impact as the shows.

That means that bringing in films from studios with a more proven record can be a win-win: It will help Apple bolster the TV+ subscription service while generating revenue for film studios that are struggling to keep up in the new era.

Services like TV+ are a growing part of Apple’s business, which has historically been focused on hardware sales. In the second quarter of its 2024 fiscal year, the services bucket accounted for $23.9 billion in quarterly revenue, which is more than half the revenue generated by iPhone hardware sales.

Report: Apple TV+ will soon get a lot more movies made by studios other than Apple Read More »

openai-shows-off-sora-ai-video-generator-to-hollywood-execs

OpenAI shows off Sora AI video generator to Hollywood execs

No lights, no camera, action —

CEO Sam Altman met with Universal, Paramount, and Warner Bros Discovery.

A robotic intelligence works as a cameraman (3D rendering).

OpenAI has launched a charm offensive in Hollywood, holding meetings with major studios including Paramount, Universal, and Warner Bros Discovery to showcase its video generation technology Sora and allay fears the artificial intelligence model will harm the movie industry.

Chief Executive Sam Altman and Chief Operating Officer Brad Lightcap gave presentations to executives from the film industry giants, said multiple people with knowledge of the meetings, which took place in recent days.

Altman and Lightcap showed off Sora, a new generative AI model that can create detailed videos from simple written prompts.

The technology first gained Hollywood’s attention after OpenAI published a selection of videos produced by the model last month. The clips quickly went viral online and have led to debate over the model’s potential impact on the creative industries.

“Sora is causing enormous excitement,” said media analyst Claire Enders. “There is a sense it is going to revolutionize the making of movies and bring down the cost of production and reduce the demand for [computer-generated imagery] very strongly.”

An AI-generated video of a cat and a human, created with OpenAI’s Sora model.

Those involved in the meetings said OpenAI was seeking input from the film bosses on how Sora should be rolled out. Some who watched the demonstrations said they could see how Sora or similar AI products could save time and money on production but added the technology needed further development.

OpenAI’s overtures to the studios come at a delicate moment in Hollywood. Last year’s monthslong strikes ended with the Writers Guild of America and the Screen Actors Guild securing groundbreaking protections from AI in their contracts. This year, contract negotiations are underway with the International Alliance of Theatrical Stage Employees—and AI is again expected to be a hot-button issue.

Earlier this week, OpenAI released new Sora videos generated by a number of visual artists and directors, including short films, as well as their impressions of the technology. The model will aim to compete with several available text-to-video services from start-ups, including Runway, Pika, and Stability AI. These other services already offer commercial uses for content.

An AI-generated video from Sora of a dog.

However, Sora has not been widely released. OpenAI has held off announcing a launch date or the circumstances under which it will be available. One person with knowledge of its strategy said the company was deciding how to commercialize the technology. Another person said there were safety steps still to take before the company considered putting Sora into a product.

OpenAI is also working to improve the system. Currently, Sora can only make videos under one minute in length, and its output still has limitations, such as glass that bounces off the floor instead of shattering or extra limbs appearing on people and animals.

Some studios appeared open to using Sora in filmmaking or TV production in future, but licensing and partnerships have not yet been discussed, said people involved in the talks.

“There have been no meetings with OpenAI about partnerships,” one studio executive said. “They’ve done demos, just like Apple has been demo-ing the Vision Pro [mixed-reality headset]. They’re trying to get people excited.”

OpenAI has been previewing the model in a “very controlled manner” to “industries that are likely to be impacted first,” said one person close to OpenAI.

Media analyst Enders said the reception from the movie industry had been broadly optimistic on Sora as it is “seen completely as a cost-saving element, rather than impacting the creative ethos of storytelling.”

OpenAI declined to comment.

An AI-generated video from Sora of a woman walking down a Tokyo street.

© 2024 The Financial Times Ltd. All rights reserved Not to be redistributed, copied, or modified in any way.

OpenAI shows off Sora AI video generator to Hollywood execs Read More »

debt-laden-warner-bros.-discovery-and-paramount-consider-merger

Debt-laden Warner Bros. Discovery and Paramount consider merger

Media firms are looking for allies to help them take the coveted media throne.

The CEOs of Warner Bros. Discovery (WBD) and Paramount Global discussed a potential merger on Tuesday, according to a report from Axios citing “multiple” anonymous sources. No formal talks are underway yet, according to The Wall Street Journal. But the conversations look like the start of consolidation talks for a media industry going through a tumultuous period of forced evolution.

On Wednesday, Axios reported that WBD head David Zaslav and Paramount head Bob Bakish met in Paramount’s New York City headquarters for “several hours.”

Zaslav and Shari Redstone, owner of Paramount’s parent company National Amusements Inc (NAI), have also spoken, Axios claimed.

One of the publication’s sources said a WBD acquisition of NAI, rather than only Paramount Global, is possible.

Talks to unite the likes of Paramount’s film studio, Paramount+ streaming service, and TV networks (including CBS, BET, Nickelodeon, and Showtime) with WBD’s Max streaming service, CNN, Cinemax, and DC Comics properties are reportedly just talks, but Axios said WBD “hired bankers to explore the deal.”

It’s worth noting that WBD will suffer a big tax hit if it engages in merger and acquisition activity before April 8 due to a tax formality related to Discovery’s merger with WarnerMedia (which formed Warner Bros. Discovery) in 2022.

A union of debts

Besides the reported talks being in very early stages, there are reasons to be skeptical about a WBD and Paramount merger. The biggest one? Debt.

The New York Times notes that WBD has $40 billion in debt and $5 billion in free cash flow. Paramount, meanwhile, has $15 billion in debt and a negative cash flow. Zaslav has grown infamous for slashing titles and even enacting layoffs to save costs. But WBD is eyeing greener pastures and described Max as “getting slightly profitable” in October. Adding more debt to WBD’s plate could be viewed as a step backward.

Additionally, Paramount is even more connected to old, flailing forms of media than WBD, as noted by The Information, which pointed out that two-thirds of Paramount’s revenue comes from traditional TV networks.

Antitrust concerns could also impact such a deal.

WBD stocks closed down 5.7 percent, and Paramount’s closed down 2 percent after Axios’ report broke.

Of course, these details about a potential merger may have been reported because WBD and/or Paramount want us to know about it so that they can gauge market reaction and/or entice other media companies to discuss potential deals.

Debt-laden Warner Bros. Discovery and Paramount consider merger Read More »

“privacy-lost”:-new-short-film-shows-metaverse-concerns

“PRIVACY LOST”: New Short Film Shows Metaverse Concerns

Experts have been warning that, as exciting as AI and the metaverse are, these emerging technologies may have negative effects if used improperly. However, it seems like the promise of these technologies may be easier to convey than some of the concerns. A new short film, titled PRIVACY LOST, is a theatrical exploration of some of those concerns.

To learn more, ARPost talked with the writer of PRIVACY LOST – CEO and Chief Scientist of Unanimous AI and a long-time emerging technology engineer and commentator, Dr. Louis Rosenberg.

PRIVACY LOST

Parents and their son sit in a restaurant. The parents are wearing slim AR glasses while the child plays on a tablet.

As the parents argue with one another, their glasses display readouts of the other’s emotional state. The husband is made aware when his wife is getting angry and the wife is made aware when her husband is lying.

A waiter appears and the child puts down the tablet and puts on a pair of AR glasses. The actual waiter never appears on screen but appears to the husband as a pleasant-looking tropical server, to the wife as a fit surf-bro, and to the child as an animated stuffed bear.

Just as the husband and wife use emotional information about one another to try to navigate their argument, the waiter uses emotional information to sell menu items as effectively as possible, aided by 3D visual samples. The waiter takes drink orders and leaves. The couple resumes arguing.

PRIVACY LOST presents what could be a fairly typical scene in the near future. But, should it be?

“It’s short and clean and simple, which is exactly what we aimed for – a quick way to take the complex concept of AI-powered manipulation and make it easily digestible by anyone,” Rosenberg says of PRIVACY LOST.

Creating the Film

“I’ve been developing VR, AR, and AI for over 30 years because I am convinced they will make computing more natural and human,” said Rosenberg. “I’m also keenly aware that these technologies can be abused in very dangerous ways.”

For as long as Rosenberg has been developing these technologies, he has been warning about their potential societal ramifications. However, for much of that career, people have viewed his concerns as largely theoretical. As first the metaverse and now AI have developed and attained their moments in the media, Rosenberg’s concerns take on a new urgency.

“ChatGPT happened and suddenly these risks no longer seemed theoretical,” said Rosenberg. “Almost immediately, I got flooded by interest from policymakers and regulators who wanted to better understand the potential for AI-powered manipulation in the metaverse.”

Rosenberg reached out to the Responsible Metaverse Alliance. With support from them, the XR Guild, and XRSI, Rosenberg wrote a script for PRIVACY LOST, which was produced with help from Minderoo Pictures and HeadQ Production & Post.

“The goal of the video, first and foremost, is to educate and motivate policymakers and regulators about the manipulative dangers that will emerge as AI technologies are unleashed in immersive environments,” said Rosenberg. “At the same time, the video aims to get the public thinking about these issues because it’s the public that motivates policymakers.”

Finding Middle Ground

While Rosenberg is far from the only person calling for regulation in emerging tech, that concept is still one that many see as problematic.

“Some people think regulation is a dirty word that will hurt the industry. I see it the opposite way,” said Rosenberg. “The one thing that would hurt the industry most of all is if the public loses trust. If regulation makes people feel safe in virtual and augmented worlds, the industry will grow.”

The idea behind PRIVACY LOST isn’t to prevent the development of any of the technologies shown in the video – most of which already exist, even though they don’t work together or to the exact ends displayed in the cautionary vignette. These technologies, like any technology, have the capacity to be useful but could also be used and abused for profit, or worse.

For example, sensors that could be used to determine emotion are already used in fitness apps to allow for more expressive avatars. If this data is communicated to other devices, it could enable the kinds of manipulative behavior shown in PRIVACY LOST. If it is stored and studied over time, it could be used at even greater scales and potentially for more dangerous uses.

“We need to allow for real-time emotional tracking, to make the metaverse more human, but ban the storage and profiling of emotional data, to protect against powerful forms of manipulation,” said Rosenberg. “It’s about finding a smart middle ground and it’s totally doable.”

The Pace of Regulation

Governments around the world respond to emerging technologies in different ways and at different paces, according to Rosenberg. However, across the board, policymakers tend to be “receptive but realistic, which generally means slow.” That’s not for lack of interest or effort – after all, the production of PRIVACY LOST was prompted by policymaker interest in these technologies.

“I’ve been impressed with the momentum in the EU and Australia to push regulation forward, and I am seeing genuine efforts in the US as well,” said Rosenberg. “I believe governments are finally taking these issues very seriously.”

The Fear of (Un)Regulated Tech

Depending on how you view the government, regulation can seem scary. In the case of technology, however, it never seems as scary as no regulation at all. PRIVACY LOST isn’t an exploration of a world where a controlling government prevents technological progress; it’s a view of a world where people are controlled by technology gone bad. And it doesn’t have to be that way.

“PRIVACY LOST”: New Short Film Shows Metaverse Concerns Read More »

the-best-new-movies-you-can-watch-at-home-right-now

The best new movies you can watch at home right now

The best new movies you can watch at home right now Read More »