Showing posts with label propaganda. Show all posts

Friday, 29 November 2024

BIRCHGROVE LEGAL TAKES OPPOSITION LEADER & LIBERAL MP FOR DICKSON TO THE HUMAN RIGHTS COMMISSION UNDER AUSTRALIA’S RACIAL DISCRIMINATION ACT: "Mr Dutton’s pattern of spreading disinformation to justify the demonisation and oppression of a people facing plausible genocide is not only in poor taste, but a violation of human rights"

 

Birchgrove Legal is a law firm based in Australia, specialising in serving the needs of organisations ranging from not-for-profits and small and medium enterprises to multinational corporations.


On 26 November 2024 the law firm released the following media release:


Opposition Leader Peter Dutton Faces Group Legal Action Over Gaza Remarks and Incitement of Hate


25 November 2024


A representative legal action has been lodged against Opposition Leader Peter Dutton under Australia’s Racial Discrimination Act, accusing him of discrimination and inciting racial hatred through his remarks and social media commentary on Gaza and Palestinians.


The action, filed with the Australian Human Rights Commission by Birchgrove Legal, represents Jewish, Palestinian, and Muslim communities in Australia.


Led by Professor Peter Slezak, an Australian Jewish academic, and Palestinian advocate Nasser Mashni, the action accuses Dutton of dehumanising Palestinians, Muslims, and Jews, while stigmatising Australians who support Palestinian rights.


The legal action states that Mr Dutton’s comments contradict Australia’s obligations under the Genocide Convention and the Rome Statute, particularly in relation to preventing genocide and protecting refugees.


Additionally, the legal action states that Mr Dutton’s public comments have led to increased vilification of Palestinians, including targeted harassment and hate crimes against peaceful protesters, and to the intimidation of Jewish and other Australians supporting the Palestinian rights movement.


Principal Solicitor at Birchgrove Legal, Moustafa Kheir, said Mr Dutton’s words had normalised anti-Palestinian hate and dehumanising rhetoric.


“Mr Dutton’s pattern of spreading disinformation to justify the demonisation and oppression of a people facing plausible genocide is not only in poor taste, but a violation of human rights,” Mr Kheir said.


“This legal action seeks to ensure that political leaders are held accountable for their words and actions, and that we are all subject to the same judicial system despite our cultural background, privilege or faith.”


Among the 22 incidents, the key allegations against Mr Dutton include:


1. Misleading Claims About Palestinian Nakba Survivors Seeking Refuge in Australia: In August 2024, Dutton claimed the Australian Government was jeopardising national security by granting almost 3,000 tourist visas to people from Gaza, which he labelled a “terrorist-controlled” zone. He also shared a misleading graph that sparked anti-Palestinian sentiment. Dutton’s insinuations that the Albanese Government’s actions were politically motivated to appease Muslim voters were reflected in hostile public responses.


2. False Claims and Propaganda: Dutton is accused of amplifying discredited far-right claims, including false Israeli propaganda about beheaded babies, and repeating debunked stories about Australian protesters allegedly shouting, “gas the Jews.” NSW Police had dismissed the latter claims, but Dutton failed to retract or apologise for spreading them.


3. Encouraging Violence and Deportation: The legal action highlights Dutton’s calls for “no restraint” in Israel’s military actions against civilians in Gaza and for the deportation of pro-Palestinian protesters from Australia.


4. Disparaging Muslim Candidates: Dutton’s comments about Muslim candidates in federal parliament being a “disaster.”


5. Atrocity Denial: Dutton is accused of engaging in ‘atrocity denial’ by failing to acknowledge Israel’s disproportionate killing of civilians and unlawful occupation of Palestinian territory.


Professor Slezak said it was abhorrent for a national leader to engage in such divisive public commentary while fully understanding the racial tensions it could breed.


“Mr Dutton is using the same ‘security threat’ language against Palestinians that was once used to demonise Jewish people before the Holocaust—and worse, he claims to do this in our name,” Mr Slezak said.


“Like many Jewish Australians, I grieve the atrocities Israel is committing against Palestinians, and we will not be intimidated into silence.


“Statements that dehumanise any group of people and fuel division must be challenged, and all parties responsible must be held accountable. That includes statements about Palestinians, whose suffering deserves recognition.”


Mr Mashni also condemned the harm caused by Mr Dutton’s comments, saying the gross dehumanisation of Palestinians and Palestinian Australians undermined international law.


“On one side, we have a government refusing to impose sanctions, and on the other, an opposition leader encouraging Australia to flout international law and withhold empathy for the human suffering occurring,” Mr Mashni said.


“Our community urgently needs Australia to take a firm stand against Israel’s ongoing genocide, which will only end with sustained international pressure.”


The legal action requests a public apology from Mr Dutton, rectification, and compensation for affected communities.


If the Commission does not resolve the matter, the applicants may pursue a Federal Court action on the same grounds. Lawsuits under the Racial Discrimination Act cannot be brought directly to court and must start in the Australian Human Rights Commission.


ENDS


Thursday, 20 June 2024

So the Coalition appears to believe that a homegrown nuclear power policy will get it over the line at the 2025 general election? Despite the fact that it will take too long to reach operational status, will cost up to $85 billion to build &, once all 7 reactors are up and running, will need many billions of litres of water annually to function

 

On Wednesday, 19 June 2024 the Leader of the Coalition Opposition & Liberal MP for Dickson Peter Dutton (Qld) held a joint press conference with Leader of the National Party & MP for Maranoa David Littleproud (Qld), Liberal MP for Farrer Sussan Ley (NSW), Liberal MP for Hume Angus Taylor (NSW) and Liberal MP for Fairfax Ted O'Brien (Qld).


These representatives of their parties have sat in the Australian Parliament for approximately the last 22, 7, 22, 10 and 7 years respectively.


From that Wednesday press conference an est. 10,047-word transcript was produced, which purports to outline a Coalition policy on nuclear power as part of Australia's energy mix, with 7 nuclear power plants to be constructed in the vicinities of Tarong and Callide in Queensland, Blackmans Creek and Mount Piper in NSW, Traralgon in Victoria, Collie in Western Australia and Port Augusta in South Australia.


All 7 of these projected sites, according to Dutton & Co, are to be compulsorily acquired from the existing owners on behalf of the Commonwealth. It is anticipated that the nuclear build will begin sometime in the next 10 years (before 2035), that the first two nuclear power plants will be completed in the next 11 to 12 years (2035-37), and that the remaining five will be completed sometime in the 2040s.


When it comes to the projected cost of the build no estimation is given other than "it will be a big bill, there’s no question about that".


However the CSIRO GenCost 2023-24 report calculates a 7 large-scale power plant build as costing up to $85 billion in today's dollars, with the first nuclear power plant completed at an est. cost of up to $17 billion. A small-scale SMR nuclear power plant (as yet commercially unrealised) has a tentative est. build cost of somewhere between $5.1 and $9.3 billion, putting the total cost for 7 small-scale plants at between est. $35.7 and $65.1 billion in today's dollars.
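The small-scale totals quoted above follow directly from the per-plant range. A minimal arithmetic check (assuming, as the figures imply, a straight multiplication of the per-plant cost by seven):

```python
# Check that 7 SMR-scale plants at $5.1-$9.3 billion each
# give the quoted total of $35.7-$65.1 billion (today's dollars).
SMR_COST_RANGE_BN = (5.1, 9.3)   # est. build cost per small-scale plant, $bn
N_PLANTS = 7

low = round(SMR_COST_RANGE_BN[0] * N_PLANTS, 1)
high = round(SMR_COST_RANGE_BN[1] * N_PLANTS, 1)

assert (low, high) == (35.7, 65.1)   # matches the figures quoted in the post
```

The quoted range checks out as a simple seven-fold multiple of the per-plant estimate.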


The Australian Energy Council, the peak industry body for electricity and downstream natural gas businesses operating in the competitive wholesale and retail energy markets, is not critical of CSIRO's timetables and costings.


The CSIRO GenCost 2023-24 final report also indicates an estimated total build time for 7 large-scale nuclear power plants of 40.6 years, with a most optimistic completion date in 2064-65 if construction commenced immediately. The estimated total build time for 7 small-scale nuclear power plants is 30.6 years, with a most optimistic completion date in 2054-55 if construction commenced immediately.


The joint press conference transcript states that "we’ve looked at water" as part of the basis for announcing the Coalition's intention to build those seven nuclear power plants if elected to govern in 2025.


However that brief mention did not qualify or quantify nuclear power production water needs, which the nuclear power-neutral Smart Water Magazine puts at:

one nuclear reactor requires between 1,514 and 2,725 litres of water per MWh. It equates to billions of gallons of water per year, and all this water requires filtering somehow.

This would see Queensland & New South Wales each required to find an additional est. 27,786 megalitres of water per annum, and Victoria, South Australia & Western Australia each required to find an additional est. 13,893 megalitres per annum.

To put that into some perspective, two nuclear power plants operating for one year in NSW would require the equivalent of 557 years of Clarence River average water discharge into the sea.
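The state-by-state water figures above can be checked with a little arithmetic. A minimal sketch, assuming (as the figures imply) a per-reactor consumption of roughly 13,893 megalitres a year, which at the upper end of the Smart Water Magazine range corresponds to a reactor generating about 5.1 TWh annually:

```python
# Rough check of the water figures above. The per-reactor figure of
# 13,893 ML/year is back-calculated from the post's state totals and is
# an assumption, not a figure from CSIRO or Smart Water Magazine.
LITRES_PER_MWH = (1_514, 2_725)   # Smart Water Magazine consumption range
ML_PER_REACTOR_YEAR = 13_893      # implied per-plant estimate, megalitres

def state_demand_ml(reactors: int) -> int:
    """Additional megalitres per annum for a state hosting `reactors` plants."""
    return reactors * ML_PER_REACTOR_YEAR

# Qld and NSW each host 2 proposed plants; Vic, SA and WA host 1 each.
assert state_demand_ml(2) == 27_786
assert state_demand_ml(1) == 13_893

# Implied annual generation if the upper consumption bound applies:
litres = ML_PER_REACTOR_YEAR * 1_000_000      # 1 ML = 1,000,000 litres
mwh = litres / LITRES_PER_MWH[1]              # roughly 5.1 million MWh
```

So the quoted state totals are internally consistent with one reactor consuming about 13.9 billion litres a year.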


Further, at the joint press conference, these Coalition politicians also appear to assert that an Australian nuclear power industry will supply "cheaper" electricity.

The Australian Energy Council states:

Australian retail household electricity prices in the National Electricity Market (NEM) are the lowest they have been for eight years, and on an international comparison are the 10th lowest of the 38 OECD countries. The average cost per unit of electricity has fallen to 27 cents/kWh according to the most recent Australian Competition and Consumer Commission National Electricity Market (NEM) data. When compared against other countries using a purchasing power exchange rate, Australian average prices per kilowatt-hour are equivalent to 17.6 US cents (c/kWh), well below the OECD average cost of 24.2 US c/kWh and less than many European countries.




World Nuclear Association graph, 30.04.24.


According to the World Nuclear Association in 2024 there are nuclear power reactors operating in 32 countries plus Taiwan.


Looking at the graph of 58 countries above, 5 of the 15 countries with the highest household electricity prices have nuclear power in the mix.


The Czech Republic, operating 6 nuclear reactors, has the second highest household electricity price; Belgium, operating 5 nuclear reactors, the 7th highest; Spain, with 7 nuclear reactors, the 10th highest; Slovenia, sharing 1 nuclear reactor, the 12th highest; and the United Kingdom, operating 9 nuclear reactors, the 14th highest.


The full transcript of the 19 June 2024 joint press conference can be read at:

https://www.peterdutton.com.au/leader-of-the-opposition-transcript-joint-press-conference-with-the-hon-david-littleproud-mp-the-hon-sussan-ley-mp-the-hon-angus-taylor-mp-and-mr-ted-obrien-mp-sydney/


Wednesday, 17 April 2024

Discussing Artificial Intelligence (AI) in April 2024

 

Well, this month attention has turned away from AI being used to create multiple fake bird species and celebrity images, or Microsoft's use of excruciatingly garish alternative landscapes to promote its software; the focus has shifted back to AI being used by bad actors in global and domestic political arenas during election years.


Nature, WORLD VIEW, 9 April 2024:

AI-fuelled election campaigns are here — where are the rules?

Political candidates are increasingly using AI-generated ‘softfakes’ to boost their campaigns. This raises deep ethical concerns.

By Rumman Chowdhury


Of the nearly two billion people living in countries that are holding elections this year, some have already cast their ballots. Elections held in Indonesia and Pakistan in February, among other countries, offer an early glimpse of what’s in store as artificial intelligence (AI) technologies steadily intrude into the electoral arena. The emerging picture is deeply worrying, and the concerns are much broader than just misinformation or the proliferation of fake news.


As the former director of the Machine Learning, Ethics, Transparency and Accountability (META) team at Twitter (before it became X), I can attest to the massive ongoing efforts to identify and halt election-related disinformation enabled by generative AI (GAI). But uses of AI by politicians and political parties for purposes that are not overtly malicious also raise deep ethical concerns.


GAI is ushering in an era of ‘softfakes’. These are images, videos or audio clips that are doctored to make a political candidate seem more appealing. Whereas deepfakes (digitally altered visual media) and cheap fakes (low-quality altered media) are associated with malicious actors, softfakes are often made by the candidate’s campaign team itself.


How to stop AI deepfakes from sinking society — and science


In Indonesia’s presidential election, for example, winning candidate Prabowo Subianto relied heavily on GAI, creating and promoting cartoonish avatars to rebrand himself as gemoy, which means ‘cute and cuddly’. This AI-powered makeover was part of a broader attempt to appeal to younger voters and displace allegations linking him to human-rights abuses during his stint as a high-ranking army officer. The BBC dubbed him “Indonesia’s ‘cuddly grandpa’ with a bloody past”. Furthermore, clever use of deepfakes, including an AI ‘get out the vote’ virtual resurrection of Indonesia’s deceased former president Suharto by a group backing Subianto, is thought by some to have contributed to his surprising win.


Nighat Dad, the founder of the research and advocacy organization Digital Rights Foundation, based in Lahore, Pakistan, documented how candidates in Bangladesh and Pakistan used GAI in their campaigns, including AI-written articles penned under the candidate’s name. South and southeast Asian elections have been flooded with deepfake videos of candidates speaking in numerous languages, singing nostalgic songs and more — humanizing them in a way that the candidates themselves couldn’t do in reality.


What should be done? Global guidelines might be considered around the appropriate use of GAI in elections, but what should they be? There have already been some attempts. The US Federal Communications Commission, for instance, banned the use of AI-generated voices in phone calls, known as robocalls. Businesses such as Meta have launched watermarks — a label or embedded code added to an image or video — to flag manipulated media.


But these are blunt and often voluntary measures. Rules need to be put in place all along the communications pipeline — from the companies that generate AI content to the social-media platforms that distribute them.


What the EU’s tough AI law means for research and ChatGPT


Content-generation companies should take a closer look at defining how watermarks should be used. Watermarking can be as obvious as a stamp, or as complex as embedded metadata to be picked up by content distributors.


Companies that distribute content should put in place systems and resources to monitor not just misinformation, but also election-destabilizing softfakes that are released through official, candidate-endorsed channels. When candidates don’t adhere to watermarking — none of these practices are yet mandatory — social-media companies can flag and provide appropriate alerts to viewers. Media outlets can and should have clear policies on softfakes. They might, for example, allow a deepfake in which a victory speech is translated to multiple languages, but disallow deepfakes of deceased politicians supporting candidates.


Election regulatory and government bodies should closely examine the rise of companies that are engaging in the development of fake media. Text-to-speech and voice-emulation software from Eleven Labs, an AI company based in New York City, was deployed to generate robocalls that tried to dissuade voters from voting for US President Joe Biden in the New Hampshire primary elections in January, and to create the softfakes of former Pakistani prime minister Imran Khan during his 2024 campaign outreach from a prison cell. Rather than pass softfake regulation on companies, which could stifle allowable uses such as parody, I instead suggest establishing election standards on GAI use. There is a long history of laws that limit when, how and where candidates can campaign, and what they are allowed to say.


Citizens have a part to play as well. We all know that you cannot trust what you read on the Internet. Now, we must develop the reflexes to not only spot altered media, but also to avoid the emotional urge to think that candidates’ softfakes are ‘funny’ or ‘cute’. The intent of these isn’t to lie to you — they are often obviously AI generated. The goal is to make the candidate likeable.


Softfakes are already swaying elections in some of the largest democracies in the world. We would be wise to learn and adapt as the ongoing year of democracy, with some 70 elections, unfolds over the next few months.


COMPETING INTERESTS

The author declares no competing interests.




Charles Darwin University, Expert Alert, media release, 12 April 2024, excerpt:


Governments must crack down on AI interfering with elections


Charles Darwin University Computational and Artificial Intelligence expert Associate Professor Niusha Shafiabady.


“Like it or not, we are affected by what we come across in social media platforms. The future wars are not planned by missiles or tanks, but they can easily run on social media platforms by influencing what people think and do. This applies to election results.


“Microsoft has said that the election outcomes in India, Taiwan and the US could be affected by the AI plays by powers like China or North Korea. In the world of technology, we call this disinformation, meaning producing misleading information on purpose to change people’s views. What can we do to fight these types of attacks? Well, I believe we should question what we see or read. Not everything we hear is based on the truth. Everyone should be aware of this.


“Governments should enforce more strict regulations to fight misinformation, things like: Finding triggers that show signs of unwanted interference; blocking and stopping the unauthorised or malicious trends; enforcing regulations on social media platforms to produce reports to the government to demonstrate and measure the impact and the flow of the information on the matters that affect the important issues such as elections and healthcare; and enforcing regulations on the social media platforms to monitor and stop the fake information sources or malicious actors.”


The Conversation, 10 April 2024:


Election disinformation: how AI-powered bots work and how you can protect yourself from their influence


AI Strategist and Professor of Digital Strategy, Loughborough University Nick Hajli



Social media platforms have become more than mere tools for communication. They’ve evolved into bustling arenas where truth and falsehood collide. Among these platforms, X stands out as a prominent battleground. It’s a place where disinformation campaigns thrive, perpetuated by armies of AI-powered bots programmed to sway public opinion and manipulate narratives.


AI-powered bots are automated accounts that are designed to mimic human behaviour. Bots on social media, chat platforms and conversational AI are integral to modern life. They are needed to make AI applications run effectively...


How bots work


Social influence is now a commodity that can be acquired by purchasing bots. Companies sell fake followers to artificially boost the popularity of accounts. These followers are available at remarkably low prices, with many celebrities among the purchasers.


In the course of our research, for example, colleagues and I detected a bot that had posted 100 tweets offering followers for sale.


Using AI methodologies and a theoretical approach called actor-network theory, my colleagues and I dissected how malicious social bots manipulate social media, influencing what people think and how they act with alarming efficacy. We can tell if fake news was generated by a human or a bot with an accuracy rate of 79.7%. It is crucial to comprehend how both humans and AI disseminate disinformation in order to grasp the ways in which humans leverage AI for spreading misinformation.


To take one example, we examined the activity of an account named “True Trumpers” on Twitter.



The account was established in August 2017, has no followers and no profile picture, but had, at the time of the research, posted 4,423 tweets. These included a series of entirely fabricated stories. It’s worth noting that this bot originated from an eastern European country.




Research such as this influenced X to restrict the activities of social bots. In response to the threat of social media manipulation, X has implemented temporary reading limits to curb data scraping and manipulation. Verified accounts have been limited to reading 6,000 posts a day, while unverified accounts can read 600 a day. This is a new update, so we don’t yet know if it has been effective.


Can we protect ourselves?

However, the onus ultimately falls on users to exercise caution and discern truth from falsehood, particularly during election periods. By critically evaluating information and checking sources, users can play a part in protecting the integrity of democratic processes from the onslaught of bots and disinformation campaigns on X. Every user is, in fact, a frontline defender of truth and democracy. Vigilance, critical thinking, and a healthy dose of scepticism are essential armour.


With social media, it’s important for users to understand the strategies employed by malicious accounts.


Malicious actors often use networks of bots to amplify false narratives, manipulate trends and swiftly disseminate misinformation. Users should exercise caution when encountering accounts exhibiting suspicious behaviour, such as excessive posting or repetitive messaging.


Disinformation is also frequently propagated through dedicated fake news websites. These are designed to imitate credible news sources. Users are advised to verify the authenticity of news sources by cross-referencing information with reputable sources and consulting fact-checking organisations.


Self awareness is another form of protection, especially from social engineering tactics. Psychological manipulation is often deployed to deceive users into believing falsehoods or engaging in certain actions. Users should maintain vigilance and critically assess the content they encounter, particularly during periods of heightened sensitivity such as elections.


By staying informed, engaging in civil discourse and advocating for transparency and accountability, we can collectively shape a digital ecosystem that fosters trust, transparency and informed decision-making.


Philadelphia Inquirer, 14 April 2024:

Expect to see AI ‘weaponized to deceive voters’ in this year’s presidential election

Alfred Lubrano


As the presidential campaign slowly progresses, artificial intelligence continues to accelerate at a breathless pace — capable of creating an infinite number of fraudulent images that are hard to detect and easy to believe.


Experts warn that by November voters in Pennsylvania and other states will have witnessed counterfeit photos and videos of candidates enacting one scenario after another, with reality wrecked and the truth nearly unknowable.


“This is the first presidential campaign of the AI era,” said Matthew Stamm, a Drexel University electrical and computer engineering professor who leads a team that detects false or manipulated political images. “I believe things are only going to get worse.”


Last year, Stamm’s group debunked a political ad for then-presidential candidate Florida Republican Gov. Ron DeSantis that appeared on Twitter. It showed former President Donald Trump embracing and kissing Anthony Fauci, long a target of the right for his response to COVID-19.


That spot was a “watershed moment” in U.S. politics, said Stamm, director of his school’s Multimedia and Information Security Lab. “Using AI-created media in a misleading manner had never been seen before in an ad for a major presidential candidate,” he said.


“This showed us how there’s so much potential for AI to create voting misinformation. It could get crazy.”


Election experts speak with dread of AI’s potential to wreak havoc on the election: false “evidence” of candidate misconduct; sham videos of election workers destroying ballots or preventing people from voting; phony emails that direct voters to go to the wrong polling locations; ginned-up texts sending bogus instructions to election officials that create mass confusion...


Malicious intent


AI allows people with malicious intent to work with great speed and sophistication at low cost, according to the Cybersecurity & Infrastructure Security Agency, part of the U.S. Department of Homeland Security.


That swiftness was on display in June 2018. Doermann’s University of Buffalo colleague, Siwei Lyu, presented a paper that demonstrated how AI-generated deepfake videos could be detected because no one was blinking their eyes; the faces had been transferred from still photos.


Within three weeks, AI-equipped fraudsters stopped creating deepfakes based on photos and began culling from videos in which people blinked naturally, Doermann said, adding, “Every time we publish a solution for detecting AI, somebody gets around it quickly.”


Six years later, with AI that much more developed, “it’s gained remarkable capacities that improve daily,” said political communications expert Kathleen Hall Jamieson, director of the University of Pennsylvania’s Annenberg Public Policy Center. “Anything we can say now about AI will change in two weeks. Increasingly, that means deepfakes won’t be easily detected.


“We should be suspicious of everything we see.”


AI-generated misinformation helps exacerbate already-entrenched political polarization throughout America, said Cristina Bicchieri, Penn professor of philosophy and psychology.


“When we see something in social media that aligns with our point of view, even if it’s fake, we tend to want to believe it,” she said.


To battle fabrications, Stamm of Drexel said, the smart consumer could delay reposting emotionally charged material from social media until checking its veracity.


But that’s a lot to ask.


Human overreaction to a false report, he acknowledged, “is harder to resolve than any anti-AI stuff I develop in my lab.


“And that’s another reason why we’re in uncharted waters.”


Sunday, 17 March 2024

Liberal MP for Dickson & Leader of the Coalition Opposition Peter Dutton called the CSIRO an unreliable scientific body producing "discredited" work and is now reaping what he has sown


"In the growing heat of debate over Coalition nuclear energy policy, Mr Dutton described the CSIRO’s GenCost report on the cost of electricity generation as “discredited” and “not a genuine piece of work” and suggested it was “well documented” that the CSIRO cannot be relied on." [InnovationAUS, 15 March 2024]





 

Open letter from Dr Doug Hilton, Chief Executive, CSIRO

15 MARCH 2024

NEWS RELEASE


Science is crucial to providing the data and models that allow society to tackle profound challenges; challenges like the COVID-19 pandemic, transition to net zero, keeping Australian industry productive and sustainable, and protecting our unique biodiversity.


For science to be useful and for challenges to be overcome it requires the trust of the community. Maintaining trust requires scientists to act with integrity. Maintaining trust also requires our political leaders to resist the temptation to disparage science.


As Chief Executive of CSIRO, I will staunchly defend our scientists and our organisation against unfounded criticism.


The GenCost report is updated each year and provides the very best estimates for the cost of future new-build electricity generation in Australia. The report is carefully produced, its methodology is clearly articulated, our scientists are open and responsive to feedback, and as is the case for all creditable science, the report is updated regularly as new data comes to hand.


The GenCost report can be trusted by all our elected representatives, irrespective of whether they are advocating for electricity generation by renewables, coal, gas or nuclear energy.


No matter the challenge we are tackling, CSIRO’s scientists and engineers can be relied on by the community to work creatively, assiduously and with integrity.


Dr Douglas Hilton

Chief Executive, CSIRO



Some of the mainstream media headlines generated by Dutton's attempt to deny the considerable downside of introducing nuclear power stations into Australia's energy grid...


The Guardian

CSIRO chief warns against ‘disparaging science’ after Peter Dutton criticises nuclear energy costings

Douglas Hilton says he will 'staunchly defend' scientists as opposition leader repeats incorrect claim that CSIRO report does not accurately...

15.03.24


Australian Broadcasting Corporation

Nation's science agency CSIRO hits back at Dutton claim that nuclear power costings were 'discredited'

The CSIRO has rebuked politicians seeking to undermine its research showing nuclear energy would be much more expensive than solar or wind...

15.03.24


The Sydney Morning Herald

‘Don’t disparage the science’: CSIRO hits back at Dutton on nuclear energy

Australia's top science agency has made a rare political intervention as CSIRO chief executive Doug Hilton defended his agency's findings on...

15.03.24


News.com.au

Nuclear question Dutton won’t answer

Peter Dutton has failed to answer a key question in a fiery clash with Bill Shorten over nuclear energy.

15.03.24


The New Daily

CSIRO hits back at Dutton's 'unfounded' criticism

Australia's national science agency has taken aim at Liberal leader Peter Dutton in a highly unusual public intervention.

15.03.24


The Canberra Times

CSIRO Chief defends GenCost report from political attack

Dr. Doug Hilton stands by CSIRO's GenCost report findings amidst political criticism from Coalition leader Peter Dutton.

15.03.24


The Age

CSIRO hits back at Dutton attack on its nuclear energy reports

Australia's top science agency has made a rare political intervention, with CSIRO chief executive Douglas Hilton defending his agency's...

15.03.24


Hunter Valley News

CSIRO boss defends scientists after Dutton attack

CSIRO chief executive officer Douglas Hilton has issued a rare public statement to urge politicians to "resist the...

15.03.24


The Wimmera Mail-Times

CSIRO boss defends scientists after Dutton attack

CSIRO chief executive officer Douglas Hilton has issued a rare public statement to urge politicians to "resist the...

15.03.24


Then, on the same day, the Centre for Independent Studies attempted to ride to Peter Dutton's rescue on social media with a whitewash of the Opposition Leader's comments and an interesting interpretation of the contents of the CSIRO news release. The CIS is a conservative, seemingly pro-nuclear 'think tank' which also supported the No position in the 2023 national referendum; its executive director just happened to have been a senior adviser to former federal Liberal Party Leader Brendan Nelson in 2008, and in 2009 was himself a candidate to replace Nelson in his northern Sydney electoral seat of Bradfield.


Centre for Independent Studies @CISOZ

CIS responds to @CSIRO's open letter.


"Not all criticisms are unfounded. If the CEO wants to defend the methods and conclusions of a particular report from criticism, he should do just that, rather than simply asserting that the report can be trusted when serious flaws still exist."

15.03.24


Thursday, 25 January 2024

Byron Echo responds to claims in the Australian Jewish Association advertisement: an "extreme organisation" which "promotes one-sided warped and dangerous views"

 

The Byron Shire Echo
24 January 2024, p.2
Click on image to enlarge

~~~~~~~~~~~~~~~~~~~~~


The Byron Shire Echo

Volume 38 #33 • January 24, 2024


We’ve had complaints


The Australian Jewish Association (AJA) sure did stir up a lot of complaints to The Echo, given their inflammatory and inaccurate statements in their page 5 advertisement last week.


There is no evidence that Hamas is responsible for all Gazan deaths, for example, and the rest of their claims around the Middle East peace process are contested – at best.


But that’s free speech.


It’s easy to say you support it, but harder as a publisher to actually follow through with it.


So who are the AJA (www.jewishassociation.org.au)?


Despite claiming they do not affiliate with any political party, their members are linked to far-right-wing think tank IPA, and the Liberal Party.


AJA tweet Jan 17


On January 17, the AJA tweeted their Echo ad with the statement: ‘Byron Bay is known as a hot spot for left-leaning activist types. The local paper, The Byron Shire Echo, is widely read and often contains anti-Israel content. AJA decided to take out a half page ad and share some facts. The ad was generously facilitated by Michael Burd. What do you think of the AJA ad?’


Well here’s what we think:


The AJA is an organisation that should not be taken seriously, because it only promotes one-sided warped and dangerous views.


It is simply untrue to imply that Israel is an innocent bystander/victim in the unfolding clusterfuck.


Their ad appeared designed to divide rather than inform.


Unlike most mainstream media, The Echo is independent and contains a range of views from its readers and contributors.


Saying The Echo ‘often contains anti-Israel content’ is like saying that those who criticise Israel are anti-Semites, or are ‘self-loathing Jews’.


As a society, aren’t we past such stupid school-yard bullying?


The AJA appears more aligned with war-mongering types, like Israeli PM, Bibi Netanyahu, than with those seeking genuine peace.


Netanyahu has forged a hard-right coalition to remain in power, yet faces much criticism from within Israel over his attempts to curb the powers of the judiciary, for example. Netanyahu also faces court on charges of corruption, and it appears he needs this war to stay in power. There is no two-state solution with Netanyahu.


As for Australian mainstream media (especially Newscorp), they appear ‘state captured’ – that is, they take paid junkets to Israel and subsequently write favourably about Israel’s policies.


By contrast, Israel’s own media is often critical of its government, and there appears more freedom to report without fear or favour in Israel than here in Australia.


ABC journalist, Antoinette Lattouf, was recently sacked after posting a Human Rights Watch video describing Israel’s starvation of Gaza civilians. It’s alleged that the national broadcaster took action after complaints from ‘Jewish lobbyists’.


The Echo is not aligned with either side of the Israel and Palestine conflict.


It’s all too easy to shoot the messenger.


Instead, genuine attempts at peace are required by ‘leaders’ if this intractable ongoing disaster has any hope of resolution.


The Australian Jewish Association (AJA), like any extreme organisation on both sides of this conflict, are not helping that cause.


Hans Lovejoy, editor


~~~~~~~~~~~~~~~~~~~~~


Letter to the Editor, 24 January 2024, p.12:


I live in Byron Bay and always look forward to reading The Echo. I was beyond shocked and horrified at seeing the Zionist propaganda advertisement on page 5 by the Australian Jewish Association (AJA). I certainly didn’t expect this in a Northern Rivers local newspaper.


As a non-practising Jewish woman who is completely appalled at what is happening in Gaza, due to the actions of Israeli leaders and IDF, I’m so disappointed with your choice to accept this advertisement.


Gaza has been under control by Israel for a very long time. They control all access into and out of Gaza with multiple checkpoints for children, women and men. The IDF shoots children throwing stones at them.


I don’t condone the actions of Hamas on 7 October, 2023. Killing civilians is not okay. However, the claims of beheading, rape and burning babies are not verified. Israel’s response was, and is, reprehensible and repugnant. Indiscriminate bombing of civilians and infrastructure is against humanitarian laws. Israel are committing war crimes and genocide against a race of people displaced in 1948 and harassed since then.


Entire families have been wiped out. Journalists, UN aid workers and medical staff are being killed by the IDF. No one is safe in Gaza.


The occupied (by Israel) West Bank is also an unsafe place for Palestinians to live, work and play. Settlers use violence to displace Palestinians from their homes and agricultural land.


To condone Israel is to condone genocide.


Linda Teese

Byron Bay


NOTE


The Byron Shire Echo is a free weekly independent tabloid newspaper that is published in the Byron Shire, New South Wales, Australia....

The Echo is totally owned by people who live in Byron Shire.


It is published by Echo Publications Pty Ltd, an Australian proprietary company, limited by shares, registered in 1990.