The Rise of AI-Generated Journalism Industry: Trends and Future Outlook

Step into any newsroom in 2025 and you’ll feel it—the white-hot pulse of disruption. AI-generated journalism industry growth isn’t a distant prediction; it’s rewriting the rules as you read this. From the way breaking stories hit your feed to who’s still on the payroll, artificial intelligence has yanked the old media order by its lapels and demanded a new deal. But beyond the buzzwords and breathless headlines, what’s really going on? Why are publishers racing to automate, and what do their choices mean for the stories shaping your world? This deep dive exposes the engines driving the AI journalism gold rush, unpacks the trust crisis nobody wants to talk about, and reveals insider truths the industry keeps close to its chest. If you think you already know where news is headed, think again—because the real story is grittier, stranger, and more urgent than you’ve been told.

The AI news explosion: How we got here and why it matters

A timeline of AI in journalism: From clunky bots to LLMs

Long before “AI-generated journalism industry growth” became a headline, newsrooms dabbled in automation with all the subtlety of a sledgehammer. The first attempts in the 1980s were little more than glorified mail merge templates spitting out weather or sports roundups—useful for filling pages, but hardly Pulitzer material. Fast forward to the early 2000s and algorithmic reporting gained ground as machine learning crept into financial bulletins, sports tickers, and election coverage. Yet, for most of the past four decades, technical limits kept AI in the back office—transcribing interviews, managing archives, and tagging metadata.

But when Large Language Models (LLMs) like OpenAI’s GPT series and Google’s Gemini hit the mainstream in the 2020s, the game changed overnight. Suddenly, machines could write, synthesize, and adapt language with a fluency that blurred the line between human and algorithm. Publishers who’d once scoffed at automation were now deploying AI news generators to churn out copy at a scale—and speed—humans simply couldn’t match.

Year | Key Milestone | Description
1989 | First Automated News | Simple scripts generate weather/sports stories
2005 | Algorithmic Reporting | Early machine learning models used for financial news
2014 | Narrative Science's Quill | Natural Language Generation (NLG) powers earnings reports
2019 | OpenAI GPT-2 | Breakthrough in LLM text generation
2021 | GPT-3 and Google BERT | Widespread adoption for content creation
2023 | ChatGPT, Gemini, Claude | AI chatbots go mainstream; newsrooms experiment with LLMs
2025 | AI-powered news generators | Platforms like newsnest.ai reshape industry norms

Table 1: Timeline of AI milestones in journalism. Source: Original analysis based on RMIT University (2025) and Trust.org (2025).

[Image: Retro newsroom with computers and a robot, symbolizing early AI in journalism]

Decades of false starts gave way to an era of relentless progress—and mounting anxiety. What once seemed like a curious experiment now powers the backbone of daily news flows, raising urgent questions about authority, ethics, and the very nature of storytelling.

Why AI-generated journalism industry growth is happening now

So why did the fuse finally catch? The convergence of plummeting computing costs, open-source language models, and relentless economic pressure has turned AI-powered journalism from an intriguing side project into a survival strategy. According to the RMIT Generative AI & Journalism Report, 2025, media organizations battered by advertising declines and digital disruption have turned to AI to trim fat and boost output.

The media crisis of the early 2020s—marked by sweeping layoffs, newsroom closures, and public trust erosion—served as an accelerant. Newsrooms scrambled to automate routine tasks, from transcription to copyediting, just to keep the lights on. As one newsroom lead put it:

"We had to choose between layoffs or LLMs—so we chose both." — Jessica, newsroom lead (illustrative, based on industry trend)

Into this chaos marched tech startups like newsnest.ai and established platforms racing to offer AI-powered news generation as a service. Their promise: instant, scalable, and personalized news feeds that sidestep human bottlenecks. What once required teams of reporters, editors, and fact-checkers now happens at the click of a button—if you’re willing to trust the machine.

Who’s driving the AI news revolution?

The charge is led by a mix of legacy publishers hungry to slash costs (think Reuters, Associated Press), tech giants eager to dominate information flows (Google, OpenAI, Microsoft), and nimble startups like newsnest.ai offering bespoke news generation at scale. Many of these players are investing heavily in proprietary algorithms, custom LLMs, and hybrid editorial workflows to keep pace with a fragmented, hyper-competitive media landscape.

  • Hidden benefits of AI-generated journalism industry growth that experts won't tell you:
    • Enables near-instant coverage of breaking news, even across obscure beats or small regions.
    • Allows for hyper-personalization—news feeds tuned to individual interests, locations, and reading habits.
    • Drastically lowers barriers to entry for independent publishers and niche content creators.
    • Enhances analytics, trend detection, and real-time audience feedback for rapid editorial pivots.
    • Frees human journalists to focus on investigative, long-form, or opinion content while automation handles rote updates.

Notably, independent developers and open-source communities also play a stealthy but significant role. Open-source LLMs and collaborative AI model training have democratized access to powerful tools, making it possible for even tiny newsrooms to punch above their weight.
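To make that concrete: with open-weights tooling, a machine-drafted routine news blurb fits in a dozen lines of code. The sketch below is a minimal illustration using the open-source Hugging Face Transformers library with the small gpt2 demo model; the city, facts, and prompt wording are invented, and a real newsroom would swap in a far more capable model and add the editorial safeguards discussed later in this piece.

```python
# Minimal sketch: drafting a routine news blurb with an open-source model.
# gpt2 is only a small demo model; any production setup would use a larger
# open-weights LLM and keep human editors in the review loop.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

facts = "City: Springfield. High: 31C. Low: 19C. Thunderstorms expected after 4 pm."
prompt = f"Write a two-sentence weather brief using only these facts: {facts}\n"

draft = generator(prompt, max_new_tokens=60, do_sample=True)[0]["generated_text"]
print(draft)
```

Output from a model this small will be rough, which is exactly why the human-in-the-loop workflows described below matter.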

Show me the numbers: Market size, growth, and the money trail

How big is the AI-generated journalism industry in 2025?

If the hype feels overwhelming, the numbers back it up. According to the TRF Insights: Journalism in the AI Era, 2025, the global AI-generated journalism market has ballooned, with 2025 industry revenues estimated to top $2.1 billion—up from just $250 million in 2020. Adoption rates have surged, especially in North America, East Asia, and Western Europe, where up to 60% of digital publishers now use AI-powered content generation for at least part of their output.

Year | Global Revenue (USD) | Adoption Rate (%) | Notable Regions
2020 | $250 million | 10 | US, UK
2021 | $520 million | 18 | US, China
2022 | $900 million | 27 | US, EU, China
2023 | $1.3 billion | 38 | US, EU, S. Korea
2024 | $1.7 billion | 51 | US, China, Japan
2025 | $2.1 billion | 60 | US, EU, Asia-Pacific

Table 2: Global AI-generated journalism industry growth (2020-2025). Source: TRF Insights, 2025

[Image: World map with AI journalism industry hotspots highlighted in North America, Europe, and Asia]

Drill down and you’ll see nuances by region and publisher type: North America leads in absolute spend, but Asia-Pacific is closing the gap with rapid tech adoption and state-backed initiatives. Small digital publishers are the fastest adopters, using tools like newsnest.ai to scale coverage without bloating payrolls, while legacy media giants still dominate overall revenues.

Who’s making money—and who’s getting left behind?

Profit margins tell a story of winners and casualties. AI-powered and hybrid newsrooms often report margins double those of human-only operations, thanks to reduced labor costs and increased content volume. Yet, these gains come at a human cost: job displacement and creative burnout are endemic in organizations that scramble to “do more with less.”

Newsroom Type | Cost per Article | Output Volume | Accuracy Rate | Financial Sustainability
AI-only | Lowest | Highest | Medium | Strong, but volatile
Hybrid (AI + human) | Low | High | High | Best balance
Human-only | Highest | Lowest | Highest | Lagging, unsustainable

Table 3: Comparison of AI-only, hybrid, and human-only newsrooms. Source: Original analysis based on Poynter (2025) and Journalism.co.uk (2025).

Small publishers often face a brutal trade-off. As Liam, a digital media publisher, lamented:

"We scaled our output 300%—but lost half our staff." — Liam, publisher (illustrative, based on documented industry trends)

The bottom line: AI-generated journalism industry growth is not a rising tide lifting all boats. Giants consolidate, nimble newcomers thrive, but many mid-tier and local outlets struggle to compete with the relentless pace of automation.

The economics of automation: What’s really at stake?

On the surface, AI seems like a panacea—slashing costs and spitting out content 24/7. But hidden expenses abound. Licensing proprietary models, maintaining vast datasets, and paying for cloud infrastructure can chew through budgets faster than legacy print runs. Copyright and data sourcing fees add more red tape, as publishers wrestle with the ethics (and legality) of using scraped content for machine learning.

  • Red flags to watch out for when evaluating AI-generated journalism ROI:
    • Underestimating the cost of proprietary model licenses and updates.
    • Neglecting compliance: copyright, data privacy, and source transparency.
    • Failing to invest in “human-in-the-loop” oversight, leading to unchecked errors.
    • Treating AI as a one-off purchase rather than an ongoing investment.
    • Overlooking the toll on newsroom morale and creative capacity.
    • Ignoring the risk of audience backlash over transparency and trust.
    • Misjudging the need for robust analytics to track content quality and audience impact.

Licensing battles and lawsuits over data sourcing are already eating into margins, with legal gray areas surrounding exactly who “owns” AI-written content—a minefield that’s yet to be fully mapped.

Inside the machine: How AI-powered news generator platforms actually work

From data to headlines: The secret life of a news-generating algorithm

Forget the sci-fi fluff. AI-powered news generators are, at their core, a marriage of massive language models, vast training datasets, and sophisticated data pipelines. At platforms like newsnest.ai, the workflow typically follows this sequence: real-time data flows in from newswires, public databases, and social feeds; LLMs (Large Language Models) parse and synthesize this information; NLG (Natural Language Generation) engines transform raw facts into readable news copy; and editorial controls tweak tone, structure, and compliance.
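To ground that sequence, here is a deliberately simplified sketch of the same flow in Python. The function names (call_llm, build_prompt, editorial_checks) and the toy rules inside them are assumptions for illustration only; they are not newsnest.ai's actual API or any vendor's.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NewsItem:
    source: str      # e.g. "newswire", "public database", "social feed"
    raw_facts: str   # structured or semi-structured input pulled in real time

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for whichever LLM API or local model a newsroom uses."""
    return f"DRAFT based on: {prompt[:80]}..."

def build_prompt(item: NewsItem, style_guide: str) -> str:
    # Prompt engineering: constrain tone, structure, and sourcing up front.
    return (
        "Write a concise news brief.\n"
        f"Style guide: {style_guide}\n"
        f"Source type: {item.source}\n"
        f"Facts:\n{item.raw_facts}\n"
        "Use only the facts above; do not invent quotes, names, or figures."
    )

def editorial_checks(draft: str) -> bool:
    # Stand-in for automated moderation and fact-flagging before human review.
    banned_phrases = ["sources say", "reportedly confirmed"]  # toy rules only
    return not any(p in draft.lower() for p in banned_phrases)

def generate_article(item: NewsItem) -> Optional[str]:
    draft = call_llm(build_prompt(item, style_guide="neutral, AP-style"))
    return draft if editorial_checks(draft) else None  # None -> route to a human editor
```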

  • Key AI journalism terms:
    • LLM (Large Language Model): Neural network trained on vast text data, capable of generating human-like language. Examples: GPT-4, Gemini.
    • NLG (Natural Language Generation): Technology that converts structured data into written narratives. Used for earnings reports, weather, sports.
    • Prompt Engineering: Crafting specific input "prompts" to guide AI output for accuracy, tone, and context.
    • AI Content Moderation: Automated screening for hate speech, misinformation, or policy violations before publication.
    • Fact-Checking Algorithms: Systems that cross-reference generated content against trusted sources to flag errors or inconsistencies.

At newsnest.ai, for example, breaking news inputs are processed by a proprietary LLM that immediately sifts for relevance, urgency, and factuality, generating articles within seconds and routing them through automated and human checks before publication.
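Of those safeguards, fact-checking algorithms are the easiest to demystify with a toy example. The sketch below flags sentences whose wording is poorly supported by a set of trusted source snippets using simple word overlap; real systems rely on retrieval, entailment models, and knowledge bases, so the scoring method and the 0.6 threshold here are illustrative assumptions, not a production technique.

```python
import re

def sentences(text: str) -> list[str]:
    # Naive sentence splitter; production systems use proper NLP tooling.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def support_score(sentence: str, sources: list[str]) -> float:
    """Fraction of a sentence's words that appear in the best-matching trusted source."""
    words = set(re.findall(r"[a-z0-9']+", sentence.lower()))
    if not words:
        return 0.0
    best = 0.0
    for src in sources:
        src_words = set(re.findall(r"[a-z0-9']+", src.lower()))
        best = max(best, len(words & src_words) / len(words))
    return best

def flag_unsupported(article: str, sources: list[str], threshold: float = 0.6) -> list[str]:
    # Returns sentences that look under-supported and need human verification.
    return [s for s in sentences(article) if support_score(s, sources) < threshold]
```

In a hybrid newsroom, anything returned by flag_unsupported() would be routed to an editor rather than published automatically.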

[Image: Futuristic server room with data streams morphing into news headlines on digital screens, representing the AI journalism workflow]

The human touch: Where editors still matter

Even as automation takes over, human oversight remains the industry’s guardrail. Fully automated publishing runs the risk of unchecked blunders—hallucinated facts, wrong attributions, or tone-deaf headlines. Leading newsrooms have embraced hybrid “human-in-the-loop” workflows, where editors fact-check, curate, and adjust AI output for nuance and voice.

Hybrid workflows typically involve:

  • AI drafts story based on structured prompts and live data.
  • Human editors review for factual accuracy, style, and cultural sensitivity.
  • Collaborative tools allow for version tracking and rapid corrections.
  • Final copy is published, with attribution and transparency about AI involvement.
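One way to picture that loop is as a small state machine wrapped around every draft. The sketch below is a simplification with invented statuses and field names, but it captures the defining property of hybrid workflows: nothing reaches readers without an explicit human decision and a visible AI-involvement label.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    AI_DRAFTED = auto()
    IN_REVIEW = auto()
    APPROVED = auto()
    REJECTED = auto()

@dataclass
class Draft:
    headline: str
    body: str
    ai_generated: bool = True
    status: Status = Status.AI_DRAFTED
    revision_notes: list[str] = field(default_factory=list)

def human_review(draft: Draft, approve: bool, note: str = "") -> Draft:
    # Editors fact-check, adjust tone, and record why a draft was changed.
    if note:
        draft.revision_notes.append(note)
    draft.status = Status.APPROVED if approve else Status.REJECTED
    return draft

def publish(draft: Draft) -> str:
    if draft.status is not Status.APPROVED:
        raise ValueError("Only human-approved drafts can be published.")
    disclosure = " [Produced with AI assistance; reviewed by an editor.]" if draft.ai_generated else ""
    return draft.headline + "\n\n" + draft.body + disclosure
```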

Step-by-step guide to mastering AI-generated journalism industry growth in your newsroom:

  1. Audit your existing editorial workflow for automation opportunities.
  2. Identify routine, high-volume content suitable for AI generation (e.g., sports, finance, weather).
  3. Select and train an AI platform with robust transparency features.
  4. Establish clear editorial guidelines for human review and fact-checking.
  5. Regularly monitor output for errors, biases, and audience response.
  6. Disclose AI involvement to readers with clear labeling.
  7. Continuously refine prompts and data sources based on feedback.

"AI gets the facts; I give the voice." — Rachel, digital editor (illustrative quote grounded in common newsroom practice)

Newsrooms that combine algorithmic efficiency with editorial judgment consistently outperform those that default to either extreme.

Common mistakes (and how to avoid them)

The rush to automate has produced its fair share of cautionary tales. Rookie errors—like overtrusting AI-generated facts, skipping quality checks, or failing to disclose automation to readers—can torpedo credibility overnight.

7 common pitfalls and how to sidestep them:

  • Blind faith in automation: Always build in human review layers.
  • Insufficient prompt engineering: Poorly crafted prompts yield sloppy, off-topic articles.
  • Neglecting transparency: Label all AI-generated content clearly.
  • Ignoring data sourcing: Use only licensed, reputable input data to avoid plagiarism or legal headaches.
  • Failing to adapt editorial standards: AI needs tailored guidelines, not generic ones.
  • Overlooking audience feedback: Track trust metrics and respond to reader concerns.
  • Complacency about model drift: Regularly update models and prompts in response to evolving news events.

Consider the case of a mid-size digital outlet that automated all local news updates in 2024. Reader trust plunged as factual errors went unchecked, and the publisher faced public backlash for failing to disclose AI involvement. After an internal audit and staff retraining on hybrid workflows, trust metrics and engagement rebounded—but not before lasting reputational damage.
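Several of those pitfalls, insufficient prompt engineering in particular, are easiest to see side by side. The contrast below is purely illustrative; the city, figures, and wording are invented, not a recommended template.

```python
# Illustrative contrast between a vague prompt and an engineered one.
# All details (city, votes, budget figures) are invented for the example.

vague_prompt = "Write a news article about the city council meeting."

engineered_prompt = """Write a 150-word news brief about tonight's Springfield city council meeting.
Audience: local residents. Tone: neutral, factual.
Use ONLY the facts below; if a detail is missing, omit it rather than guessing.
Facts:
- Vote: 5-2 to approve the 2026 parks budget ($4.2M).
- Dissenters cited the road-repair backlog.
- Public comment: 12 speakers, mostly supportive.
Do not invent quotes. End with the date of the next meeting only if it is provided."""
```

The first prompt invites the model to fill gaps with plausible-sounding fiction; the second narrows scope, pins the facts, and forbids invention.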

Truth, trust, and the dark side: Can AI-generated news be believed?

Fakes, bias, and hallucinations: The risks no one wants to talk about

AI-generated journalism may deliver speed and scale, but it comes with a darker underbelly. High-profile blunders—like the infamous “phantom earthquake” stories erroneously published by automated feeds, or LLMs inventing expert quotes—have become cautionary folklore. According to Poynter, 2025, audience distrust is rising, fueled by lapses in transparency and persistent fears of algorithmic bias.

Error rates vary: while AI can outperform humans on rote fact recall, it’s prone to “hallucinations”—fabricated facts that sound plausible but are entirely bogus. Bias creeps in through skewed training data, amplifying stereotypes or omitting minority voices.

Source Type | Error Rate (Avg.) | Most Common Errors
AI-generated | 12% | Hallucination, misattribution
Human-written | 7% | Typos, omissions, subjective bias
Hybrid (AI + human) | 4% | Lapses in editorial oversight, missing context

Table 4: Error rates in AI-generated vs. human-written vs. hybrid news stories. Source: Original analysis based on RMIT University (2025) and Poynter (2025).

[Image: Dramatic newsroom with shadowy figures and glitching headlines, symbolizing AI journalism trust risks]

When mistakes slip through, the fallout is swift—audience backlash, advertiser boycotts, and regulatory scrutiny are now part of the risk calculus for any publisher betting on AI.

Myth-busting: What AI can (and can’t) do

Much of the hype around AI-generated journalism rests on shaky myths—like the idea that machines are inherently objective, or that LLMs can replace hard-nosed investigative reporters.

  • Myths vs. reality in AI-generated journalism:
    • "AI is always objective": False. AI inherits the biases encoded in its training data, often reflecting institutional or cultural blind spots.
    • "AI never makes mistakes": Laughable. "Hallucinations" and factual slip-ups are documented risks, especially without human oversight.
    • "AI can replace investigative journalism": Not yet. LLMs excel at pattern recognition but lack the context, skepticism, and on-the-ground instincts of experienced reporters.
    • "AI-generated news is always faster": Only if data pipelines are clean and prompts are tightly engineered; otherwise, humans still win on speed for unstructured breaking events.

Limits of automation become glaring in investigative and opinion journalism, where context, intuition, and ethical judgment are irreplaceable.

Spotting the difference: Can readers really tell?

Recent research reveals a discomforting truth: most readers struggle to distinguish AI-generated news from human-authored stories, especially when platforms scrimp on disclosure. This blurring of authorship only compounds the trust crisis.

Priority checklist for AI-generated journalism industry growth implementation—how to ensure transparency and maintain reader trust:

  1. Clear attribution: Disclose AI involvement in bylines and footnotes.
  2. Source transparency: Link to underlying data and training sets when possible.
  3. Regular audits: Monitor error rates and correct mistakes swiftly.
  4. Human oversight: Build in editorial review before publication.
  5. Feedback loops: Invite reader input and flag corrections visibly.

Labeling, disclosure, and rigorous ethical standards are not optional—they’re the last line of defense in a media ecosystem awash with automated content.
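Disclosure also works best when it is machine-readable as well as visible to readers. As one possible approach, the sketch below attaches a small metadata block to an article; the field names and values are assumptions for illustration, not an established industry schema.

```python
import json
from datetime import date

def build_disclosure(model_name: str, human_reviewed: bool, data_sources: list[str]) -> dict:
    # Field names are illustrative, not a standard schema.
    return {
        "ai_involvement": "ai_assisted_human_reviewed" if human_reviewed else "ai_generated",
        "model": model_name,
        "data_sources": data_sources,
        "review_date": date.today().isoformat(),
        "corrections_contact": "corrections@example.org",  # placeholder address
    }

article = {
    "headline": "Local reservoir levels hit five-year high",  # invented example
    "body": "(article text)",
    "disclosure": build_disclosure(
        model_name="large-language-model-v1",   # hypothetical model label
        human_reviewed=True,
        data_sources=["state water authority bulletin"],
    ),
}

# Visible label for readers, plus machine-readable JSON for aggregators and auditors.
print("Produced with AI assistance; reviewed by an editor.")
print(json.dumps(article["disclosure"], indent=2))
```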

Real-world impact: How AI news is reshaping culture, politics, and business

Case studies: Newsrooms transformed (for better or worse)

Consider a major global publisher that shifted to an AI-first model in 2024. Output soared—breaking news, live event updates, and niche industry coverage multiplied overnight. But internal surveys revealed rising burnout among remaining staff and a sense of creative stagnation. Engagement metrics plateaued as audiences tuned out repetitive, formulaic copy.

Contrast that with a regional newsroom that struggled to keep up with AI-powered rivals. With a skeleton crew and limited resources, it faced a stark choice: automate or risk extinction. By integrating newsnest.ai’s platform for routine reporting, they freed human reporters to focus on original, high-impact stories—restoring both morale and relevance.

Platforms like newsnest.ai have become catalysts for change across digital publishing, enabling small teams to deliver constant, reliable breaking news without traditional infrastructure.

[Image: Split scene of a traditional busy newsroom contrasted with a sleek AI-powered digital news hub]

The new power players: Who wins in the AI news game?

The AI news revolution has minted a new class of winners—AI-first media startups, freelance technologists, and data-savvy publishers who lean into automation without sacrificing quality.

  • Unconventional uses for AI-generated journalism industry growth:
    • Real-time hyperlocal news feeds, covering everything from city council meetings to school events.
    • Personalized newsletters that adapt to individual reader preferences based on real-time analytics.
    • Instant event-driven coverage—sports, finance, weather—across multiple languages and regions.
    • Automated aggregation and synthesis of diverse news sources for trend spotting and analysis.

But the shakeup comes with consequences. Freelancers and editors face shrinking opportunities, while audience trust remains fragile. The push to optimize for engagement and clicks risks privileging sensationalism over substance—a dynamic critics warn could further polarize public discourse.

Regulation, control, and the fight for truth

The regulatory landscape is as fragmented as the industry itself. The US, EU, and Asia each approach AI-generated journalism with varying degrees of skepticism, oversight, and intervention.

Year | Region | Policy Milestone
2021 | EU | First draft of AI media transparency guidelines
2023 | US | FTC guidance on AI labeling and deceptive practices
2024 | China | National standards for AI-generated news disclosure
2025 | Global | UN debate on cross-border AI media regulation

Table 5: Timeline of major AI journalism regulations and policy milestones. Source: Original analysis based on verified regulatory announcements

While some hope for global standards, the current reality is a patchwork of policies—leaving publishers in a constant state of adaptation as they navigate compliance, liability, and international norms.

The future of news: What’s next for AI-generated journalism industry growth?

Predictions for 2026 and beyond

Technological breakthroughs continue at a breakneck pace: multimodal systems that integrate text, video, audio, and real-time fact-checking are already being prototyped. But new risks loom, as the rise of AI-powered news monopolies and echo chambers threatens to further fracture the public square.

[Image: Futuristic cityscape with AI-written news billboards and diverse feeds]

Cross-industry lessons: What journalism can learn from other AI-adopting sectors

Journalism is not alone. The disruption wrought by AI in finance (algorithmic trading), marketing (personalized ads), and healthcare (diagnostic automation) offers valuable lessons.

Timeline of AI-generated journalism industry growth evolution (with parallel milestones):

  1. 2010s: Early automation in financial services and algorithmic marketing.
  2. 2020: AI-powered customer support and medical diagnostics gain traction.
  3. 2022: Newsrooms adopt AI for transcription, tagging, and basic reporting.
  4. 2024: Hybrid human-AI models dominate content production.
  5. 2025: AI-powered news generators become industry standard; regulatory debates intensify.

Adaptability and multidisciplinary teams are critical—combining editorial, technical, and analytical expertise to ride the wave rather than be swept away.

How to prepare: Actionable steps for newsrooms, journalists, and readers

Every organization eyeing AI adoption should start with a rigorous self-assessment:

  • Checklist for newsrooms:

    • Evaluate which parts of your workflow are ripe for automation.
    • Invest in staff training on prompt engineering and AI oversight.
    • Establish transparent editorial and compliance guidelines.
    • Monitor audience trust and adapt to feedback in real time.
  • For readers:

    • Check for AI disclosure and transparency on news sites.
    • Compare multiple sources to spot inconsistencies or errors.
    • Use tools (like those from newsnest.ai) for news literacy and fact-checking.
    • Provide feedback to outlets about content clarity and trust.

Best practices evolve rapidly, but ongoing literacy—staying curious, skeptical, and informed—is your best defense in an automated media age.

The global race: How different countries are harnessing (or fighting) AI-generated journalism

The US leads on innovation and commercialization, China on state-driven scale, and the EU on regulation and transparency. Emerging economies, meanwhile, experiment with low-cost, open-source solutions to close the information gap.

Region | Market Size (2025) | Regulatory Approach | Adoption Leaders
US | $700 million | Self-regulation, FTC guidance | Tech giants, digital media
China | $600 million | State standards, disclosure mandates | State media, tech startups
EU | $450 million | Transparency, data protection focus | Public broadcasters, large dailies
Global South | $180 million | Open-source, minimal regulation | Independent publishers

Table 6: Market and regulatory differences by region—who’s ahead and why. Source: Original analysis based on TRF Insights, 2025

[Image: Collage of international newsrooms with technological and cultural diversity, representing global AI journalism adoption]

AI in investigative journalism: Opportunity or oxymoron?

AI’s strengths (pattern recognition, data aggregation) shine in large-scale data journalism—unearthing financial fraud or tracking environmental trends. But for deep-dive investigations, human judgment and context remain essential.

  • Red flags to watch out for when using AI in investigative reporting:
    • Over-reliance on data at the expense of field reporting.
    • Failing to verify AI-generated leads with human sources.
    • Underestimating the ethical risks of “black box” algorithms.
    • Skimping on legal and compliance reviews for sensitive stories.

Case studies show that the most effective investigative teams blend AI analysis with relentless shoe-leather reporting, ensuring stories have both data depth and narrative impact.

When AI writes opinion: The ethics and eccentricities

The rise of AI-written op-eds and editorial content has sparked fierce debate about authenticity, responsibility, and the limits of machine perspective.

  • Key terms and distinctions:
    • Editorial: An official statement of a publication's view, traditionally authored by senior editors.
    • Analysis: Deep dive into motives, implications, and context, often blending data and narrative.
    • AI-generated opinion: Articles crafted by algorithms, based on prompts and training data, lacking genuine emotional perspective.

"No matter how smart the model, it can’t feel outrage." — David, columnist (illustrative quote, based on industry consensus)

The philosophical challenge is simple but profound: can a machine argue for justice, or just simulate the look of doing so?

Jargon buster: Essential terms for understanding AI-generated journalism industry growth

The must-know vocabulary (and why it matters)

  • LLM (Large Language Model): Backbone of current AI journalism, enables sophisticated, context-aware text generation.
  • NLG (Natural Language Generation): Converts data and prompts into human-readable stories, at scale.
  • Prompt engineering: The art of crafting inputs that guide AI outputs for accuracy and style.
  • Reinforcement learning: Training models using reward/punishment signals to improve over time.
  • Hallucination: When AI generates plausible-sounding but false or fabricated content.
  • Fact-checking algorithms: Automated systems that compare output to trusted sources to flag errors.

A working knowledge of these concepts separates the savvy media consumer—or publisher—from the mark.

[Image: Infographic-style photo of interconnected AI journalism terms mapped visually in a collaborative workspace]

Debunking confusion: Similar terms, different meanings

Don’t let the jargon tangle you. Here’s how the terms actually differ in the wild:

  • AI-generated journalism: Content produced entirely by algorithms, with minimal human touch.

  • Automated journalism: Broader term encompassing any machine-assisted news production, including simple scripts and NLG.

  • Augmented journalism: Humans and machines collaborating—AI handles grunt work, humans steer the narrative.

  • Commonly confused terms in AI news tech:

    • “Prompt” vs. “query”: A prompt guides the AI’s style and scope; a query simply requests information.
    • “Fact-checking” vs. “content moderation”: Fact-checking verifies accuracy; moderation screens for policy violations.
    • “Bias” vs. “hallucination”: Bias reflects systemic skew; hallucination is outright fabrication.

These distinctions matter for business models, ethical debates, and—most of all—user trust.

Key takeaways: Synthesis, reflection, and what to watch next

The new rules of the game: What matters most now

AI-generated journalism industry growth isn't just a technological story; it's a cultural, ethical, and existential reckoning for newsrooms everywhere. We're living through a pivot as consequential as the invention of the printing press, but with the stakes multiplied by scale, speed, and the fragility of public trust. As the dust settles, a few hard-won lessons emerge.

The 10 commandments of surviving (and thriving) in the AI-generated journalism era:

  1. Embrace transparency—always label AI-generated content.
  2. Keep humans in the loop for editorial judgment and ethics.
  3. Prioritize source quality and data integrity.
  4. Invest in prompt engineering and workflow customization.
  5. Audit regularly for bias, hallucinations, and compliance lapses.
  6. Build audience trust through feedback and correction mechanisms.
  7. Balance speed with accuracy—never sacrifice substance for output.
  8. Diversify revenue and engagement models beyond clickbait.
  9. Collaborate across disciplines; train teams in both editorial and technical fluency.
  10. View AI not as a rival, but as a tool—your newsroom’s edge depends on how you wield it.

[Image: Broken newsroom sign with a glowing AI chip underneath, symbolizing the new era of news]

Still have questions? Where to go from here

The debates are far from over. As new models launch, regulations evolve, and public expectations shift, the conversation will only grow louder—and more urgent. For the latest updates, ongoing analysis, and a hub for informed debate, newsnest.ai remains a go-to resource for both professionals and readers tracking the AI news revolution.

"The story isn’t over—AI is just getting started." — Ava, technology analyst (illustrative quote)

Stay skeptical, stay curious, and keep pushing for answers—because the news, and its future, is now as much yours to shape as anyone’s.
