ForgottenFlux commented on Google's AI-powered search summaries use 10x more energy than a standard Google search | The Hidden Environmental Impact of AI • technology •
AI's rapid growth has transformed digital life, but its significant environmental impact remains largely unchecked.
AI-powered features can consume up to 10 times more electricity than a traditional search; at scale, that demand could rival the power usage of an entire country.
The proliferation of energy-intensive data centers powering AI is outpacing the electric grid's capacity, forcing utilities to maintain fossil fuel plants for reliability.
Estimates suggest AI could account for 9% of U.S. energy demand by 2030, substantially contributing to climate change.
Lack of industry transparency and mandatory reporting makes quantifying AI's full environmental toll difficult.
Tech companies negotiate discounted utility rates, shifting costs to ratepayers and reducing incentives for energy efficiency.
Government regulation has been slow and industry-influenced, focusing on hypothetical future risks over current, tangible harms.
The burden of AI's environmental impact disproportionately falls on Global South communities where data centers are located.
Tech companies resist mandatory disclosures, prioritizing profits over sustainability while the public bears the physical costs.
Signal's desktop app stores encryption keys for chat history in plaintext, making them accessible to any process on the system
Researchers were able to clone a user's entire Signal session by copying the local storage directory, allowing them to access the chat history on a separate device
This issue was previously highlighted in 2018, but Signal has not addressed it, stating that at-rest encryption is not something the desktop app currently provides
Some argue this is not a major issue for the "average user", as other apps also have similar security shortcomings, and users concerned about security should take more extreme measures
However, others believe this is a significant security flaw that undermines Signal's core promise of end-to-end encryption
A pull request was made in April 2023 to implement Electron's safeStorage API to address this problem, but there has been no follow-up from Signal
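The core issue can be demonstrated with a short sketch. By default, Signal Desktop keeps the key for its SQLCipher-encrypted message database in a plaintext JSON file inside the profile directory, readable by any process running as the same user. The path and field name below reflect commonly reported locations and are assumptions that may vary by platform and version:

```python
import json
from pathlib import Path
from typing import Optional

def read_signal_db_key(profile_dir: str) -> Optional[str]:
    """Return the database key Signal Desktop stores in plaintext,
    or None if the config file or key field is absent."""
    config_path = Path(profile_dir) / "config.json"
    if not config_path.exists():
        return None
    config = json.loads(config_path.read_text())
    # The key sits unprotected in an ordinary JSON field,
    # with no OS keychain or at-rest encryption in between.
    return config.get("key")

# Typical profile locations (platform-dependent):
#   Linux:   ~/.config/Signal
#   macOS:   ~/Library/Application Support/Signal
#   Windows: %AppData%\Signal
```

Because nothing here requires elevated privileges, copying the config file along with the local database directory is enough to clone the session onto another machine, which is what the researchers demonstrated. Wrapping this key with Electron's safeStorage API, as the 2023 pull request proposed, would at least tie it to the OS keychain.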
Netflix is discontinuing its cheapest ad-free subscription tier, starting with the UK and Canada, with more countries expected to follow.
Netflix has begun notifying Basic subscribers of the last day they can access the service on that plan, prompting them to move to the ad-supported Standard tier or the more expensive Standard/Premium plans.
In Canada:
Original Basic plan price: $9.99/month
New Standard with ads price: $5.99/month
New Standard plan price: $16.49/month
Increase from Basic to Standard: $6.50/month (65% increase)
In the UK:
Original Basic plan price: £7.99/month
New Standard with ads price: £4.99/month
New Standard plan price: £10.99/month
Increase from Basic to Standard: £3.00/month (37.5% increase)
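The quoted percentage increases follow directly from the listed prices; a quick sketch to verify the arithmetic:

```python
def percent_increase(old: float, new: float) -> float:
    """Return the price increase as a percentage of the old price."""
    return (new - old) / old * 100

# Canada: Basic $9.99/month -> Standard $16.49/month
canada = percent_increase(9.99, 16.49)  # ~65.1%
# UK: Basic £7.99/month -> Standard £10.99/month
uk = percent_increase(7.99, 10.99)      # ~37.5%
```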
The Basic plan ($11.99/month) is no longer available for new US subscribers.
Netflix's ad-supported tier now has 40 million global monthly active users, up from 35 million a year ago.
ForgottenFlux commented on Proton launches privacy-focused Google Docs alternative: Docs in Proton Drive is an open-source, end-to-end encrypted collaborative document editor • technology •
ForgottenFlux commented on As mind-reading technology improves, Colorado passes first-in-nation law to protect privacy of our thoughts • privacyguides •
Colorado passes a first-in-nation law protecting the privacy of biological and brain data, treating it like fingerprints when it is used to identify people.
Advances in artificial intelligence have led to medical breakthroughs, including devices that can read minds and alter brains.
Neurotechnology devices such as Emotiv and Somnee are used in health care; they can let users control computers with their thoughts, improve brain function, and identify impairments.
Most of these devices are not regulated by the FDA and are marketed for wellness.
These benefits come with risks: insurers could use brain data to discriminate, law enforcement could use it in interrogations, and advertisers could use it to manipulate consumers.
Medical research facilities are subject to privacy laws, but private companies amassing large caches of brain data are not.
The Neurorights Foundation found that two-thirds of these companies are already sharing or selling data with third parties.
The new law takes effect on Aug. 8, but it is unclear which companies are subject to it and how it will be enforced.
Pauzauskie and the Neurorights Foundation are pushing for a federal law and even a global accord to prevent brain data from being used without consent.
Telegram founder Pavel Durov claimed in an interview that the company only employs "about 30 engineers."
Security experts say this is a major red flag for Telegram's cybersecurity, as it suggests the company lacks the resources to effectively secure its platform and fight off hackers.
Telegram's chats are not end-to-end encrypted by default, unlike more secure messaging apps like Signal or WhatsApp. Users have to manually enable the "Secret Chat" feature to get end-to-end encryption.
Telegram also uses its own proprietary encryption protocol, MTProto, whose custom design has long raised security concerns among cryptographers.
As a social media platform with nearly 1 billion users, Telegram is an attractive target for both criminal and government hackers, but it seems to have very limited staff dedicated to cybersecurity.
Security experts have long warned that Telegram should not be considered a truly secure messaging app, and Durov's recent statement may indicate that the situation is worse than previously thought.