01/29/2025 | Press release
The International Federation of Journalists (IFJ) together with journalists' unions around the world has issued a series of recommendations calling for action on artificial intelligence (AI).
Credit: NICOLAS TUCAT / AFP
Journalists are experiencing the initial tremors of a coming artificial intelligence earthquake, one that is reshaping our industry more profoundly than the digital revolution of recent decades. Journalists have always been journalism's ablest defenders. We do this best when we act collectively - through our trades unions, which can consider issues in a democracy of practitioners and engage with governments and employers as necessary.
'Artificial intelligence' describes a broad range of processes that have the capacity to impact all workers. The consequences of this technology for journalists in particular will be profound. Journalists have a deep personal responsibility to ensure that their work is wholly ethical and complies with the IFJ's Global Charter of Ethics for Journalists.
Unions occupy a critical and unique position to facilitate the harnessing, economic framing and regulation of this emergent potential. They can ensure that AI is not used unless it serves the creation of dispassionate, ethically produced news, for the benefit of humanity and consistent with the IFJ's Global Charter of Ethics for Journalists. Journalists have a duty to respect facts and fight for the public's right to truth.
JOURNALISTS AS DEFENDERS OF JOURNALISM
AI has the capacity to significantly reduce the working hours necessary to produce news, potentially eliminating tedious or mundane tasks. Most newsrooms, however, are already under-resourced. The strongest newsrooms benefit from a diversity of perspectives, for both accuracy and enrichment. Time saved by new technology must be redeployed to support the work that humans excel at: telling more stories and building community.
AI cannot replace human journalists, and its output must not be considered 'journalism', save where it has been subject to appropriate human oversight and checking.
In many instances it is valuable for those producing news to spend their working lives in close proximity to those upon whom they are reporting. AI must not be used to allow media workers to become any more remote from the communities whose stories deserve to be told.
Journalists must be the standard-bearers for quality of expression, accuracy and the presentation of information. No reduction in accuracy is acceptable.
Generative AI language models contain no information about "truth", only about what occurs frequently in the input material. Such models cannot engage in fact-checking or assess the weight and credibility of sources. They cannot seek out new sources or balanced perspectives. There are many examples of such models producing material that is wholly inaccurate.
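This point can be made concrete with a deliberately simplified sketch, written here in Python; it is not drawn from the IFJ's recommendations or from any real newsroom system. The toy model below completes text purely from word-frequency counts in its input, so when a false claim occurs more often than a correction, the model confidently repeats the falsehood: nothing in the mechanism encodes or checks truth.

# Toy 'language model': predicts the next word purely from how often
# word pairs occur in its input text. It has no notion of truth.
from collections import Counter, defaultdict

def train(corpus_words):
    # Count how often each word follows each other word.
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus_words, corpus_words[1:]):
        follows[prev][nxt] += 1
    return follows

def complete(follows, word, length=5):
    # Always pick the most frequent continuation - frequency, not accuracy.
    out = [word]
    for _ in range(length):
        if word not in follows:
            break
        word = follows[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

# A corpus in which a false claim simply appears more often than the true one.
corpus = ("the mayor resigned yesterday . " * 5 +
          "the mayor denied resigning yesterday . ").split()

model = train(corpus)
print(complete(model, "the"))  # prints: the mayor resigned yesterday . the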
News platforms have a record of adopting new technology to reduce costs instead of improving their products. Journalists must commit themselves to finding ways to deploy AI to make their news more accurate, complete, compelling and relevant.
AI tools are not available in many parts of the world - indeed, they are available in just a handful of the world's 7,000 languages. Journalists have a responsibility to highlight this and to press tech platforms to make them universally available.
Generative AI outputs are biased by their training material. All news platforms must adopt systematic methods to ensure that such biases are not reflected in news stories.
International regulation of AI is required. Where this is considered, journalists must be represented, either by their own unions or by the IFJ. Many initiatives are in progress to this end - the EU's AI Act, a US Executive Order, the Bletchley Declaration and the Hiroshima Process. All are well intentioned, but none has yet created effective guardrails.
RESPONSIBILITIES AS CUSTODIANS OF INFORMATION PRODUCTION
AI has the capacity to create falsehoods so compelling and so prolific that they could overwhelm our entire information ecosystem. Videos and photographs appearing to prove that fabricated events occurred can be created in seconds, along with compendious 'supporting' articles. Unions must seek agreements with news platforms to ensure that any published or broadcast works that are wholly or largely the product of AI are clearly labelled.
All published works that purport to be journalism must be the ultimate responsibility of a suitably qualified and/or experienced journalist, who should use their best professional endeavours to ensure that all works are appropriately credited. This must be done in a manner consistent with the IFJ's Global Charter of Ethics for Journalists. To support this, journalists' rights to be identified as authors of their works need to be strengthened, respected and enforced.
The means by which AI generates its output - the machine consumption, absorption and regurgitation of material created by others - risks undermining the economic benefits that creators are entitled to enjoy from their work. Licensing agreements between news and AI companies must receive consent from journalists and compensate them for their work.
All journalists - including staff journalists, freelancers and independent and self-published journalists - must be entitled to organise and to bargain collectively with AI companies with respect to the terms for use of their work in generative AI language models and outputs.
Those terms must include compensation for ingestion and use that has already occurred, and a fair share of this compensation must be paid to actual journalists, whether they are working as staff, as freelances or under 'work-for-hire' terms.
JOURNALISTS' RIGHTS AS WORKERS
All processes that deploy AI and involve workers - whether employees, freelancers or contractors - must be transparent. Whatever algorithms and processes are applied must be open to inspection. Any worker who has been assessed, judged, or evaluated by a process relying upon AI should be able to request a review of that process by a human.
AI has the capacity to reduce contact between workers, both in ordinary daily interaction, and in formal processes such as team meetings, interviews, and appraisals. Valuable efficiencies may arise. However, the benefits of human contact must be intrinsic to all workplace planning and should be a requirement upon all employers. Where unplanned, informal contact declines, planned social encounters should be a requirement of work organisation. Journalists' wishes to work remotely should also be respected.
Training to use new technology must be available to all workers.
Where AI is used in recruitment, evaluation, or assessment, workers must be consulted about the process, informed of how AI is deployed, and their consent must be obtained before deployment. This process must involve workers' elected representatives or trades unions, as well as individuals. Human decision-making must be retained to protect against biased systems.
A FAST, FLEXIBLE RESPONSE
The best regulation of the ownership of the value created by journalists comes from agreements between news platforms and journalists' trades unions. Such agreements have the advantage of speed, precise definition, and shared objectives. As such, they should be encouraged as the easiest and quickest means to regulate AI in news production.
AI development and application is currently unregulated. Given the power of the systems now being developed, held wholly in the hands of corporations, the dangers are immense. For this reason, AI must be brought under robust international regulation. Only an international response is sufficient, as this technology and its products know no boundaries.
Where unions are negotiating for freelance contributors, ingestion for the purpose of AI "training" must be on the basis of consent and compensation.