Feed slashdot Slashdot


Link https://slashdot.org/
Feed https://rss.slashdot.org/Slashdot/slashdotMain
Copyright Copyright Slashdot Media. All Rights Reserved.
Updated 2024-11-25 06:30
Apple's iPhone Loses Top Spot In China To Huawei
According to a report from Jefferies analysts, Huawei has overtaken Apple's iPhone as the smartphone market share leader in China. CNBC reports: The analysts said smartphone sales in China have shown positive growth year over year, driven primarily by high double-digit growth in Android sales led by Huawei, Xiaomi and Honor devices. But Apple's iPhone has seen a significant, double-digit decline, and its volume growth year over year has been negative since the iPhone 15 launched, according to the analysts. "We believe weak demand in China would eventually lead to lower-than-expected global shipments of iPhone 15 in 2023," the analysts wrote, adding that the trend suggests the iPhone will "lose" to Huawei next year. The Jefferies analysts wrote that Android's volume growth can't be chalked up to discounts and that discounts on iPhones, excluding the iPhone 15 models, have been stable, while the average discount for Android "is not high." The analysts noted that resale iPhone 15 devices are all "trading at discounts to official selling prices," which also reflects the weak demand in China.Read more of this story at Slashdot.
Cloudera Hit With $240 Million Patent Verdict Over Cloud-Storage Technology
An anonymous reader quotes a report from Reuters: Patent owner StreamScale won a $240 million jury verdict in Waco, Texas, federal court on Friday in a patent case against data-management software company Cloudera. The jury said (PDF) after a four-day trial that Cloudera infringed three StreamScale patents related to cloud-based data storage technology. Cloudera said in a statement that it intends to challenge the decision and that it would not impact the company's customers. StreamScale attorney Jason Sheasby called the verdict a "referendum on the importance of small inventors and small businesses." StreamScale owns patents for inventor Michael Anderson's "accelerated erasure coding" technology, which the company's complaint called a "cornerstone" of modern data storage. It sued Santa Clara, California-based Cloudera in 2021 for allegedly infringing several of its patents. The lawsuit accused Cloudera's CDH open source data-management platform of violating StreamScale's patent rights. Cloudera argued its software worked in a different way than StreamScale's inventions and said that the patents were invalid. StreamScale also accused other companies, including Intel, of infringing its patents in the 2021 lawsuit. Intel filed a separate lawsuit later that year arguing that StreamScale's allegations violated a non-disclosure agreement.Read more of this story at Slashdot.
Comcast Resists Call To Stop Its Misleading '10G Network' Claims
Jon Brodkin reports via Ars Technica: An advertising industry group urged Comcast to stop its "10G" ads or modify them to state that 10G is an "aspirational" technology rather than something the company actually provides on its cable network today. The National Advertising Division (NAD), part of the advertising industry's self-regulatory system run by BBB National Programs, ruled against Comcast after a challenge lodged by T-Mobile. In its decision announced Thursday, the NAD recommended that Comcast "discontinue its '10G' claims" or "modify its advertising to (a) make clear that it is implementing improvements that will enable it to achieve '10G' and that it is aspirational or (b) use '10G' in a manner that is not false or misleading, consistent with this decision." Comcast plans to appeal the decision, so it won't make any changes to marketing immediately. If Comcast loses the appeal and agrees to change its practices, it would affect more than just a few ads because Comcast now calls its entire broadband network "10G." "In February 2023, Comcast rebranded its fixed Internet network as 'Xfinity 10G Network' to signify technological upgrades to its network that are continuing to be implemented," the NAD said. Comcast's website claims that the "Xfinity 10G Network is already here! You'll see continual increases in network speed and reliability. No action is required on your part to join the Xfinity 10G Network." It also claims that 10G is "complementary" to the 5G mobile network.Read more of this story at Slashdot.
Apple Plans To Update iPhones In-Store Without Opening the Boxes
Malcolm Owen reports via AppleInsider: Writing in his "Power On" newsletter for Bloomberg, Mark Gurman claims that Apple has a system that can update the operating system of iPhones before they get sold. Crucially, it can do so without opening the box. The system consists of a "pad-like device": store employees place unopened iPhone boxes onto it to trigger an update. The pad wirelessly turns on the iPhone, runs the software update, then turns it off again. While only iPhones are mentioned in the report, it's plausible that the idea could be extended to other products in Apple's catalog. It is claimed that consumers may benefit from the system at Apple Stores before the end of 2023.Read more of this story at Slashdot.
New York Bill Would Require a Criminal Background Check To Buy a 3D Printer
An anonymous reader quotes a report from Gizmodo: New York residents eyeing a new 3D printer may soon have to submit to a criminal background check if a newly proposed state bill becomes law. The recently introduced legislation, authored by state senator Jenifer Rajkumar, aims to snuff out an increasingly popular loophole where convicted felons who would otherwise be prohibited from legally buying a firearm instead simply 3D print individual components to create an untraceable "ghost gun." If passed, New York would join a growing body of states placing restrictions on 3D printers in the name of public safety. The New York bill, called AB A8132, would require a criminal history background check for anyone attempting to purchase a 3D printer capable of fabricating a firearm. It would similarly prohibit the sale of those printers to anyone with a criminal history that disqualifies them from owning a firearm. As it's currently written, the bill doesn't clarify what models or makes of printers would potentially fall under this broad category. The bill defines a three-dimensional printer as a "device capable of producing a three-dimensional object from a digital model." "Three-dimensionally printed firearms, a type of untraceable ghost gun, can be built by anyone using a $150 three-dimensional printer," Rajkumar wrote in a memorandum explaining the bill. "This bill will require a background check so that three-dimensional printed firearms do not get in the wrong hands." The NYPD has reported a 60% increase in seized ghost guns over the past two years. Meanwhile, on a national level, the Bureau of Alcohol, Tobacco, Firearms, and Explosives reported a 1083% increase in ghost gun recoveries from 2017-2021, figures they say are likely underreported.Read more of this story at Slashdot.
Analogue is Making a 4K Nintendo 64
Analogue, the company best known for modern takes on retro hardware, is turning its attention to the 64-bit era with the Analogue 3D, a reimagining of the Nintendo 64. From a report: The company says the new console will have "100 percent compatibility" with N64 cartridges in every region and will even support 4K output. It will also include "Original Display Modes featuring reference quality recreations of specific model CRTs and PVMs" for the purists out there, along with Bluetooth support and four controller ports. Today's announcement is mostly a tease. While we have some details, there's no word on price or a specific release date beyond 2024. Analogue isn't even showing the hardware yet -- right now, we just have these brief glimpses of what appears to be the console, as well as the wireless 8BitDo controller that's launching alongside it.Read more of this story at Slashdot.
US Plans To Push Other Countries Not to Pay Hacker Ransoms
The US is pushing a group of governments to publicly commit to not make ransom payments to hackers ahead of an annual meeting of more than 45 nations in Washington later this month. From a report: Anne Neuberger, deputy national security adviser, told Bloomberg News that she is "incredibly hopeful" about enlisting support for such a statement but acknowledged it's a "hard policy decision." If members can't agree to the statement in advance of the meeting, then it will be included as a discussion point, she said. [...] The aim of the statement is to change that calculus, Neuberger said. "Ransom payments are what's driving ransomware," she said. "That's the reason we think it's so needed."Read more of this story at Slashdot.
Netflix Deepens Videogame Push
Last year Netflix put up a billboard on Los Angeles's Sunset Boulevard to poke fun at itself. It read: "Wait, Netflix Has Games?" The company is working hard to clear up any confusion. It is deepening its push into the videogame industry, taking advantage of the studios it has acquired in the past two years to create more titles based on popular Netflix movies and TV shows. WSJ: Though Netflix has up to now focused on mobile games -- which appeal to casual gamers and can be downloaded on a smartphone or tablet -- it is taking steps to expand into higher-end games that can be streamed from TVs or PCs. That approach would put it up against giants such as Sony and Microsoft, which just closed its $75 billion acquisition of Activision Blizzard, and would bring some significant technical challenges. Over the next several months, Netflix subscribers will be able to play games on their mobile devices based on hits such as Korean thriller "Squid Game" and supernatural comedy "Wednesday," according to people familiar with the situation. Similarly, Netflix is discussing games based on "Extraction," its Sherlock Holmes series and its "Black Mirror" series, the people said. Even as Netflix creates homegrown titles, it will continue to license the well-known games, from "Bloons TD 6" to "Classic Solitaire," that currently make up its catalog. It has discussed plans to release a game within the popular action-adventure series "Grand Theft Auto" from Take-Two Interactive Software through a licensing deal, some of the people said. The strategy rips a page from the streaming giant's playbook in Hollywood, where it built an audience based on reruns from other studios -- such as "Friends," "The Office" and "Breaking Bad" -- while gearing up machinery to churn out originals like "House of Cards" and "Stranger Things."Read more of this story at Slashdot.
Colorado Supreme Court Approves Use of Google Search Data in Murder Case
The Colorado Supreme Court ruled today that evidence gleaned from a warrant for Google's search data could be used in the prosecution of a teen who was charged with murder for a fire that killed five people in the Denver area. From a report: As police scrambled to solve the source of the 2020 blaze, they asked Alphabet's Google to provide information about people who searched for the address of the house that went up in flames, using a controversial technique known as a keyword search warrant. After some initial objections, Google provided data that enabled detectives to zero in on five accounts, leading to the arrest of three suspects in the case. Lawyers for one of the suspects, Gavin Seymour, who was found to have Googled the home's address 14 times in the days before the fire, argued that the keyword warrant constituted an illegal search and that any evidence from it should be suppressed. His motion is the first known challenge to the constitutionality of keyword search warrants. The case is ongoing. In its 74-page decision, the court found that law enforcement had acted in good faith when it obtained the warrant for the teen's search history. Still, it stressed that the findings were specific to the facts of the case, and it refrained from weighing in about the use of Google's search data more broadly.Read more of this story at Slashdot.
Binance To Halt New UK Customers From Using Crypto Exchange
Binance has suspended access to its crypto exchange for new users based in the UK, after a partnership with a third party to approve communications on its platform under new local rules was terminated by the country's watchdog. From a report: Any customers based in the UK not already signed up to Binance's platform were no longer able to join the exchange from 5 p.m. in London on Monday, according to a blog post published by Binance. The move puts the world's largest crypto exchange out of reach for new users in the UK, setting the scene for a battle by Binance to return to one of the sector's biggest markets outside of the US. The UK's financial promotions regime was widened starting on Oct. 8 to include cryptoasset service providers, regardless of their location. All crypto platforms are now required by the regulator to display clear risk warnings to UK-based consumers and meet higher technical standards, with all communications needing to be approved by an FCA-authorized firm. Penalties for not doing so include being added to the FCA's public warning list, as well as unlimited fines and prison time.Read more of this story at Slashdot.
AMD Pulls Graphics Driver After 'Anti-Lag+' Triggers Counter-Strike 2 Bans
AMD has taken down the latest version of its AMD Adrenalin Edition graphics driver after Counter-Strike 2-maker Valve warned that players using its Anti-Lag+ technology would be banned under Valve's anti-cheat rules. From a report: AMD first introduced regular Anti-Lag mitigation in its drivers back in 2019, limiting input lag by reducing the amount of queued CPU work when the processor was getting too far ahead of the GPU frame processing. But the newer Anti-Lag+ system -- which was first rolled out for a handful of games last month -- updates this system by "applying frame alignment within the game code itself," according to AMD. That method leads to additional lag reduction of up to 10 ms, according to AMD's data. That additional lag reduction could offer players a bit of a competitive advantage in these games (with the usual arguments about whether that advantage is "unfair" or not). But it's Anti-Lag+'s particular method of altering the "game code itself" that sets off warning bells for the Valve Anti-Cheat (VAC) system. After AMD added Anti-Lag+ support for Counter-Strike 2 in a version 23.10.1 update last week, VAC started issuing bans to unsuspecting AMD users who activated the feature. "AMD's latest driver has made their 'Anti-Lag/+' feature available for CS2, which is implemented by detouring engine dll functions," Valve wrote on social media Friday. "If you are an AMD customer and play CS2, DO NOT ENABLE ANTI-LAG/+; any tampering with CS code will result in a VAC ban." Beyond Valve, there are also widespread reports of Anti-Lag+ triggering crashes or account bans in competitive online games like Modern Warfare 2 and Apex Legends. But Nvidia users haven't reported any similar problems with the company's Reflex system, which uses SDK-level code adjustments to further reduce input lag in games including Counter-Strike 2.Read more of this story at Slashdot.
Stack Overflow Cuts 28% of Workforce as the AI Coding Boom Continues
Coding help forum Stack Overflow is laying off 28 percent of its staff as it struggles toward profitability. From a report: CEO Prashanth Chandrasekar announced today that the company is "significantly reducing the size of our go-to-market organization," as well as "supporting teams" and other groups. After the team doubled its employee base last year, Chandrasekar told The Verge's Nilay Patel in an interview that about 45 percent of those hires were for its go-to-market sales team, which he said was "obviously the largest team." Prosus acquired Stack Overflow in a $1.8 billion deal in mid-2021.Read more of this story at Slashdot.
Intel Unveils 14th-gen 'Raptor Lake Refresh' CPUs With Speeds Up To 6GHz
Intel's latest 14th-gen Core desktop processors, "Raptor Lake Refresh," do away with the AI NPU and complex tiling system inside the recent 14th-gen "Meteor Lake" mobile chips. But AI is being used here, specifically to assist with what gamers care about: improving game performance and CPU clock speeds. From a report: As expected, Intel's "refreshed" Raptor Lake chips offer modest performance improvements over their predecessors, while ushering in eventual platform upgrades like Thunderbolt 5. But there are boosts, such as a tweaked Intel 7 process that pushes turbo clock speeds up to 6GHz with the new Core i9-14900K and a new "Application Performance Optimization (APO)" feature that appears to optimize the CPU for a particular game. But -- and this is important, given inflation -- Intel is holding pricing (almost) steady. Prices in Intel's 14th-gen Core desktop S-series line will range from $589 for the 24-core, 32-thread Core i9-14900K down to the $294 14-core, 20-thread Core i5-14600KF, for a total of six new processors. This is the third straight generation in which Intel has left its processor prices virtually unchanged, including the 13th-gen Raptor Lake and the 12th-gen Alder Lake chip, whose slowest chip was priced at $264. Perhaps not surprisingly, Intel's not offering many direct generation-over-generation comparisons with its own processors, though it selected a few content-creation benchmarks to highlight with its Core i7-14700K. There, performance improvements range from 3 percent (Adobe Lightroom) to 18 percent (Autodesk). According to Roger Chandler, vice president and general manager of Intel's enthusiast PC and workstation business, the Core i7 features the best multithreaded performance on a Core i7 ever. Intel executives said the chipmaker had about 130 partners and customers for the 13th-gen launch, and expect the same for the debut of the 14th-gen Raptor Lake Refresh chips.Read more of this story at Slashdot.
LinkedIn To Lay Off Hundreds of People Amid Broader Restructuring
LinkedIn plans to lay off more than 660 people across its engineering, product, talent and finance teams, it announced Monday -- representing more than 3% of the company's global workforce. From a report: LinkedIn has now seen two major rounds of layoffs this year following its cutting in May of 716 jobs and shuttering of its Chinese app InCareer. Those cuts were announced alongside a broader restructuring of the firm's Global Business Organization. The job positions are being eliminated as part of broader efforts at the company to optimize around artificial intelligence. LinkedIn released a slew of new AI product features earlier this month, including an AI-assisted candidate discovery for recruiters, and AI-powered coaching for LinkedIn's premium subscribers.Read more of this story at Slashdot.
Minecraft Has Sold Over 300 Million Copies
Minecraft already has the distinction of being the bestselling video game of all time. Today, it adds more down to that particular feather in its cap with the announcement that it has sold a staggering 300 million copies. From a report: "As we approach the 15th anniversary, Minecraft remains one of the best-selling games of all time, with over 300 million copies sold, a milestone no one could have dreamed of when we were all placing our first blocks," Helen Chiang, head of Mojang Studios, said in a statement. Even the second bestselling video game of all time, Grand Theft Auto V, doesn't come close to Minecraft's numbers, topping out at 185 million reported sales.Read more of this story at Slashdot.
Why the US Government Has $5 Billion in Bitcoin
The U.S. government is one of the world's biggest holders of bitcoin, but unlike other crypto whales, it doesn't care if the digital currency goes up or down in value. From a report: That is because Uncle Sam's stash of some 200,000 bitcoin was seized from cybercriminals and darknet markets. It is primarily offline in encrypted, password-protected storage devices known as hardware wallets that are controlled by the Justice Department, the Internal Revenue Service or another agency. What the federal government does with its bitcoin has long been a topic of interest among crypto traders because any sale could potentially swing prices or cause other ripple effects in the $1 trillion digital-asset market. The U.S. has been notoriously slow to convert its stash of bitcoin into dollars. It isn't HODLing, crypto parlance for "holding on for dear life" and never intending to sell. Nor is it waiting for bitcoin to go "to the moon" so it can sell its holdings for a hefty profit. Rather, that big pile of bitcoin is more a byproduct of a lengthy legal process than strategic planning. "We don't play the market. We basically are set by the timing in our process," said Jarod Koopman, executive director of the IRS's cyber and forensics services section, which oversees all activities focused on cybercrimes.Read more of this story at Slashdot.
Have Economists Contributed to Inequality?
A new book by Nobel prize-winning economist Angus Deaton "feels like an existential crisis," writes Fast Company, "as he questions his own legacy - and wonders whether policies prescribed by economists over the years have unintentionally contributed to inequality" in America. Angus Deaton: People who have a four-year college degree are doing pretty well. But if you go to the people who don't have a college degree, horrible things are happening to them... The opportunities are getting bigger and bigger, but the safety net's falling further and further away... I think of it as much broader than income inequality: People without a BA are like an underclass. They're dispensable... Fast Company: Why has Europe been able to avoid so many of these rises in inequality and "deaths of despair" and the U.S. hasn't? Deaton: Anne [Case, my wife] and I wrestled with that in our book Deaths of Despair. One reason is that we don't have any safety net here... The other story is we've got this hideous healthcare system... we're spending [almost] 20% of GDP. There's no other country that spends anything like that. That money comes out of other things we could have, like a safety net and a better education system. And it's not delivering much, except the healthcare providers are doing really quite well: the hospitals, the doctors, the pharma companies, the device manufacturers. Not only does it cost a lot, but we fund it in this really bizarre way, which is that for most people who are not old enough to qualify for Medicare, they get their health insurance through their employer... Fast Company: The theme of your new book seems to be something of an existential crisis for you as an economist. How much are economists to blame for some of these issues? Deaton: [...] I think there are some broad things that we didn't do very well. We bent the knee a little too much to the Chicago libertarian view, that markets could do everything. I'm not trying to say that I was right and everybody else was wrong. I was with the mob. I think we thought that financial markets were much safer than they'd been in the past, and we didn't have to worry about them as much. That was dead wrong. I think we were way overenthusiastic about hyperglobalization. We had this belief that people would lose their jobs but they'd find other, better jobs, and that really didn't happen. So there are a lot of things that I think are going to be seriously reconsidered over the next years. But he admits economists are short on solutions for economic inequality. "When they say, 'Well, what would work?' there's this uncomfortable silence where you feel foolish. Everybody's quoting [former Italian philosopher and politician Antonio] Gramsci [saying that] the old system is broken but the new system is struggling to be born. No one really knows what it's going to look like." The book is titled Economics in America: An Immigrant Economist Explores the Land of Inequality. But in the interview Deaton still remains hopeful about America, calling it "a very inventive place," and noting that in the field of economics "there's always hope and there's always change; economics is a very open profession, and it changes very quickly."Read more of this story at Slashdot.
Caltech Ends Its Wi-Fi Lawsuit Against Apple and Broadcom
An anonymous reader shared this report from the Verge:Caltech has had some ups (winning $1.1 billion) and some downs (losing the $1.1 billion award and being ordered to a trial on damages) since suing Apple and Broadcom in 2016 over Wi-Fi patents. Reuters reported this week that Caltech is dropping its yearslong lawsuit against Apple and Broadcom, about two months after the companies came to a "potential settlement." Caltech wrote in a filing with a US District Court in California that it would drop its claims "with prejudice," meaning it can't refile its case, and asked that Broadcom do so as well, stating later that Broadcom "does not oppose this request." Caltech also writes that it will dismiss its claims against Apple - again, "with prejudice." The filing then says that Caltech "respectfully requests that all counterclaims asserted by Apple also be dismissed."Read more of this story at Slashdot.
California Begins World's Largest Dam Removal/River Restoration Project
Four California dams are now being dismantled, reports the Arizona Republic: Sometime in January, work crews will start drilling a tunnel at the base of a concrete dam on the Klamath River, near the California-Oregon border. The tunnel will begin the process of drawing down the reservoir behind the dam, known as Copco-1, and prepare the site to remove the dam from the river. Time was, removing a dam in the West was unheard of. Dams were built to store water, generate electricity, and manage the use of rivers for growers. But environmental activists started telling the story of how dams damage a river and its ecosystem, and Indigenous communities have told their stories of how dams took away traditional resources and food sources. And so, in recent years, we've seen more dams removed. In Arizona, the removal of a hydroelectric dam on Fossil Creek led to the restoration of a sparkling waterway and a habitat for fish, birds and other wildlife. California's dam-removal project began in June, reports the Los Angeles Times, when the smallest of the four dams was torn down by crews using heavy machinery. "The other three dams are set to be dismantled next year, starting with a drawdown of the reservoirs in January." "The scale of this is enormous," said Mark Bransom, CEO of the nonprofit Klamath River Renewal Corp., which is overseeing dam removal and river restoration efforts. "This is the largest dam removal project ever undertaken in the United States, and perhaps even the world." The $450-million budget includes about $200 million from ratepayers of PacifiCorp, who have been paying a surcharge for the project. The Portland-based utility - part of billionaire Warren Buffett's conglomerate Berkshire Hathaway - agreed to remove the aging dams after determining it would be less expensive than trying to bring them up to current environmental standards. The dams were used purely for power generation, not to store water for cities or farms. "The reason that these dams are coming down is that they've reached the end of their useful life," Bransom said. "The power generated from these dams is really a trivial amount of power, something on the order of 2% of the electric utility that previously owned the dams." An additional $250 million came through Proposition 1, a bond measure passed by California voters in 2014 that included money for removing barriers blocking fish on rivers. Crews hired by the contractor Kiewit Corp. have been working on roads and bridges to prepare for the army of excavators and dump trucks. "We have thousands of tons of concrete and steel that make up these dams that we have to remove," Bransom said. "We'll probably end up with 400 to 500 workers at the peak of the work..." In addition to tearing down the dams, the project involves restoring about 2,200 acres of reservoir bottom to a natural state.Read more of this story at Slashdot.
How Two Florida Men Scammed 'Uber Eats' Out of $1 Million
An anonymous Slashdot reader shared this report from Business Insider: Two men from the Fort Lauderdale, Florida area scammed Uber Eats out of more than $1 million over 19 months, local police say. The suspects carried out the scheme - which began in January 2022 - by creating fake accounts on the Uber Eats app to act as both the customer and courier when placing grocery orders, the Broward County Sheriff's Office said in a statement. This worked because Uber Eats provides couriers with prepaid cards they can use to purchase up to $700 worth of goods to complete customers' orders. Police claim the suspects would show up as couriers for their fake grocery orders before canceling them and using the prepaid cards to purchase gift cards at the stores. According to the sheriff's office, "On January 24, 2023, detectives conducted a surveillance operation and observed Morgan and Blackwood travel to 27 different Walgreens committing fraud that totaled a $5,013.28 loss for Uber that day."Read more of this story at Slashdot.
Google's AI-Powered 'Project Green Light' Speeds Traffic, Reduces Fuel Consumption and Carbon Emissions
Google's "Project Green Light" uses machine learning on Maps data to optimize the length of green lights, reports Engadget, "reducing idle times as well as the amount of braking and accelerating vehicles have to do there."When the program was first announced in 2021, it had only been pilot tested in four intersections in Israel in partnership with the Israel National Roads Company but Google had reportedly observed a "10 to 20% reduction in fuel and intersection delay time" during those tests. The pilot program has grown since then, spreading to a dozen partner cities around the world, including Rio de Janeiro, Brazil; Manchester, England and Jakarta, Indonesia. "Today we're happy to share that... we plan to scale to more cities in 2024," Yael Maguire, Google VP of Geo Sustainability, told reporters during a pre-brief event last week. "Early numbers indicate a potential for us to see a 30% reduction in stops...." Maguire also noted that the Manchester test reportedly saw improvements to emission levels and air quality rise by as much as 18%. The company also touted the efficacy of its Maps routing in reducing emissions, with Maguire pointing out at it had "helped prevent more than 2.4 million metric tons of carbon emissions - the equivalent of taking about 500,000 fuel-based cars off the road for an entire year."Read more of this story at Slashdot.
Long-Dormant Viruses Are Now Waking Up After 50,000 Years as Planet Warms
This week Bloomberg explored so-called "zombie viruses" - that is, long-dormant microbes which they call "yet another risk that climate change poses to public health" as ground that's been frozen for "millenniums" suddenly starts thawing - for example, in the Arctic, which they write is warming "faster than any other area on earth." With the planet already 1.2C warmer than pre-industrial times, scientists are predicting the Arctic could be ice-free in summers by the 2030s. Concerns that the hotter climate will release trapped greenhouse gases like methane into the atmosphere as the region's permafrost melts have been well-documented, but dormant pathogens are a lesser explored danger. Last year, virologist Jean-Michel Claverie's team published research showing they'd extracted multiple ancient viruses from the Siberian permafrost, all of which remained infectious... Ways in which this could present a threat are still emerging. A heat wave in Siberia in the summer of 2016 activated anthrax spores, leading to dozens of infections, killing a child and thousands of reindeer. In July this year, a separate team of scientists published findings showing that even multicellular organisms could survive permafrost conditions in an inactive metabolic state, called cryptobiosis. They successfully reanimated a 46,000-year-old roundworm from the Siberian permafrost, just by re-hydrating it... Claverie first showed "live" viruses could be extracted from the Siberian permafrost and successfully revived in 2014. For safety reasons his research focused only on viruses capable of infecting amoebas, which are far enough removed from the human species to avoid any risk of inadvertent contamination. But he felt the scale of the public health threat the findings indicated had been under-appreciated or mistakenly considered a rarity. So, in 2019, his team proceeded to isolate 13 new viruses, including one frozen under a lake more than 48,500 years ago, from seven different ancient Siberian permafrost samples - evidence of their ubiquity. Publishing the findings in a 2022 study, he emphasized that a viral infection from an unknown, ancient pathogen in humans, animals or plants could have potentially "disastrous" effects. "50,000 years back in time takes us to when Neanderthal disappeared from the region," he says. "If Neanderthals died of an unknown viral disease and this virus resurfaces, it could be a danger to us."Read more of this story at Slashdot.
Rust-Based 'Resources' is a New, Modern System Monitor for Linux
An anonymous reader shared this article from the Linux blog OMG! Ubuntu: The System Monitor app Ubuntu comes with does an okay job of letting you monitor system resources and oversee running processes - but it does look dated... [T]he app's graphs and charts are tiny, compact, and lack the glanceability and granular-detail that similar tools on other systems offer. Thankfully, there are plenty of ace System Monitor alternatives available on Linux, with the Rust-based Resources being the latest tool to join the club. And it's a real looker... Resources shows real-time graphs of the utilisation of core system components... You can also see a [sortable and searchable] list of running apps and processes, which are separated in this app. It's also possible to select a refresh interval "from very slow/slow/normal/fast/very fast (though tempting to select, 'very fast' can increase CPU usage)." And selecting an app or process "activates a big red button you can click to 'end' the app/process (a submenu has options to kill, halt, or continue the app/process instead)..." "If you don't like the 'Windows-iness' of Mission Center - which you may have briefly spotted in my Ubuntu 23.10 release video - then Resources is a solid alternative."Read more of this story at Slashdot.
Dropbox CEO Defends 90% Remote-Work Model, Says 'Future of Work' is Here
An anonymous Slashdot reader shared this report from Fortune: What would Drew Houston, CEO of Silicon Valley software giant Dropbox, say to fellow CEOs - like Google's Sundar Pichai or Meta's Mark Zuckerberg - who seem to believe that three days a week in-person is crucial for company culture? "I'd say, 'your employees have options,'" Houston told Fortune this past week. "They're not resources to control." While Dropbox used to work near-entirely at its Bay Area headquarters, Houston has completely warmed to a distributed model since the pandemic - and is mystified as to why other leaders haven't joined him. (Houston founded Dropbox in 2007, the year after he graduated from MIT, and has been its CEO ever since.) "From a product design perspective, customers are our employees. We've stitched together this working model based on primary research," he told Fortune at Dropbox's WIP Conference - its first in-person event since 2019 - in New York on Tuesday. "We've just been handed the keys that unlock this whole future of work, which is actually here." In April 2021, right when most of the country became eligible for vaccines and people began reconvening again across the globe, Dropbox encouraged the opposite. It officially announced its intent to go Virtual First, which meant employees were free to work remotely 90% of the time, only commuting in for the occasional meeting or happy hour... Granted, not everyone got to appreciate the perks. In April, Dropbox laid off 500 employees - 16% of its staff - due to "slowing growth" and "the A.I. era" requiring a reallocation of resources.... Houston and his team have found, in practice, a handful of two- or three-day offsites per quarter - 10% of the year - works best for their people. Crucially, it provides that oft-referenced cultural connect and brainstorming time that pro-office zealots insist upon, without wearing workers out with a commute grind or needless hours in drab conference rooms.Read more of this story at Slashdot.
Two 'Godzilla' Scifi Novellas Finally Get English Translations, Capturing 1950s Horror at Nuclear Weapons
Godzilla and Godzilla Raids Again - two novellas based on Toho's first two Godzilla movies - were finally published in an English translation this month. Both were written by science fiction author Shigeru Kayama, "who also penned the original scenarios from which the films in question were based," according to Our Culture magazine. And the book's translator calls Kayama both "a figure who is a little bit like Philip K. Dick in this country" and "the key person who developed the contours of the Godzilla story. I think it is no exaggeration to say that he is perhaps closer to being Godzilla's real father than anyone else. Without him, the monster we have today wouldn't exist." The original Godzilla film is a deeply powerful, mournful film that isn't just about a big monster stomping on buildings. It is a serious reflection on Japan's nuclear fears during the Cold War, which left it caught between heavily armed superpowers. Japan recognized that radioactive weapons of mass destruction being developed by the U.S. and U.S.S.R. were threats that had the power to suddenly emerge and destroy its citizens and cities at any moment - like Godzilla. We should remember that in the film, it was hydrogen bomb testing in the Pacific that disturbed Godzilla, who then took revenge for his destroyed habitat by trampling Tokyo and blasting it with atomic rays... Interestingly, in the novellas that I've translated, Kayama sometimes restored elements that the director and his assistants removed in the moviemaking process. Perhaps the most noticeable one is that in the scenario, Kayama wanted to begin with a long voice-over that talks directly about the horrors of atomic and hydrogen bombs. He envisioned that as the voice was speaking, the screen would show images from historical footage of Hiroshima and Nagasaki, as well as images of the tremendously unlucky (and ironically named) fishing vessel Lucky Dragon No. 5, which accidentally found itself in the path of an H-bomb test in the South Pacific in early 1954. (The horrific fate of this boat directly inspired the producer at Toho Studios to make the film.) However, the director of the film, Ishirō Honda, and his assistant who helped with the screenplay both felt that this kind of direct commentary was too direct for a popular film, and so they toned down the "protest" element in the story. It's clear that they, like Kayama, wanted Godzilla to serve as a monstrous embodiment of radiation and all of the destruction that it could bring, but they also didn't point fingers at the U.S. military which had dropped the atomic bombs on Hiroshima and Nagasaki and was busily developing even more horrifying weapons. After all, the U.S.S.R. had built its own arsenal, and so nuclear weapons no longer belonged to a single country - the threat was broader than that. Plus, protest films rarely attracted a big, popular following. So, Honda and his crew toned down the outspoken language and imagery, but there was still enough imagery left for viewers in 1954 to recall Hiroshima, Nagasaki, and the Lucky Dragon. Interestingly, when Kayama published the novellas, he included an author preface that talks about the anti-nuclear movement and encourages readers to read Godzilla and Godzilla Raids Again as his contribution to that movement. Next the translator hopes to create an English translation of the novel The Luminous Fairies and Mothra. But for the Godzilla book, he struggled with how to assign a gender to Godzilla.
"Some people feel very viscerally, like the people at Toho studios feel very strongly that Godzilla is an 'it' and not a 'he' or 'she' or 'they,'" he told MovieWeb. "I kind of give my rationale for that choice in the afterward - Kayama thought about Godzilla as a stand-in for the nuclear bomb, and it was men in America who were developing the hydrogen bombs that frightened Japan so much in 1954. So maybe it's perhaps not inappropriate to call Godzilla 'he.'"Read more of this story at Slashdot.
'OK, So ChatGPT Just Debugged My Code. For Real'
ZDNet's senior contributing editor also maintains software, and recently tested ChatGPT on two fixes for bugs reported by users, and a new piece of code to add a new feature. It's a "real-world" coding test, "about pulling another customer support ticket off the stack and working through what made the user's experience go south." First... please rewrite the following code to change it from allowing only integers to allowing dollars and cents (in other words, a decimal point and up to two digits after the decimal point). ChatGPT responded by explaining a two-step fix, posting the modified code, and then explaining the changes. "I dropped ChatGPT's code into my function, and it worked. Instead of about two-to-four hours of hair-pulling, it took about five minutes to come up with the prompt and get an answer from ChatGPT." Next up was reformatting an array. I like doing array code, but it's also tedious. So, I once again tried ChatGPT. This time the result was a total failure. By the time I was done, I probably fed it 10 different prompts. Some responses looked promising, but when I tried to run the code, it errored out. Some code crashed; some code generated error codes. And some code ran, but didn't do what I wanted. After about an hour, I gave up and went back to my normal technique of digging through GitHub and StackExchange to see if there were any examples of what I was trying to do, and then writing my own code. Then he posted the code for a function handling a WordPress filter, along with the question: "I get the following error. Why?" Within seconds, ChatGPT responded... Just as it suggested, I updated the fourth parameter of the add_filter() function to 2, and it worked! ChatGPT took segments of code, analyzed those segments, and provided me with a diagnosis. To be clear, in order for it to make its recommendation, it needed to understand the internals of how WordPress handles hooks (that's what the add_filter function does), and how that functionality translates to the behavior of the calling and the execution of lines of code. I have to mark that achievement as incredible - undeniably 'living in the future' incredible... As a test, I also tried asking ChatGPT to diagnose my problem in a prompt where I didn't include the handler line, and it wasn't able to help. So, there are very definite limitations to what ChatGPT can do for debugging right now, in 2023... Could I have fixed the bug on my own? Of course. I've never had a bug I couldn't fix. But whether it would have taken two hours or two days (plus pizza, profanity, and lots of caffeine), while enduring many interruptions, that's something I don't know. I can tell you ChatGPT fixed it in minutes, saving me untold time and frustration. The article does include a warning. "AI is essentially a black box, you're not able to see what process the AI undertakes to come to its conclusions. As such, you're not really able to check its work... If it turns out there is a problem in the AI-generated code, the cost and time it takes to fix may prove to be far greater than if a human coder had done the full task by hand." But it also ends with this prediction. "I see a very interesting future, where it will be possible to feed ChatGPT all 153,000 lines of code and ask it to tell you what to fix... I can definitely see a future where programmers can simply ask ChatGPT (or a Microsoft-branded equivalent) to find and fix bugs in entire projects."Read more of this story at Slashdot.
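(For readers curious about the add_filter() detail above, here is a minimal, hypothetical sketch of the failure mode the article describes, written in PHP because the code in question is a WordPress hook and runnable only in a WordPress plugin context. The filter name, callback, and discount logic are invented for illustration and are not the ZDNet author's actual code. By default add_filter() forwards only one argument to a callback, so a two-argument callback fails until the fourth parameter, $accepted_args, is set to 2 - the fix the editor describes.)

```php
<?php
// Hypothetical illustration (not the ZDNet author's actual code): a WordPress
// filter callback that needs two values from the code that fires the filter.
function example_discount_filter( $price, $quantity ) {
    // Apply a 10% discount when five or more items are ordered.
    if ( $quantity >= 5 ) {
        $price *= 0.9;
    }
    return $price;
}

// Registered with the defaults, WordPress forwards only ONE argument to the
// callback, so PHP raises an ArgumentCountError when the filter runs:
//   add_filter( 'example_item_price', 'example_discount_filter' );

// The fix described above: set the fourth parameter ($accepted_args) to 2 so
// both $price and $quantity reach the callback (10 is the default priority).
add_filter( 'example_item_price', 'example_discount_filter', 10, 2 );

// Elsewhere in the plugin, the filter would be applied with both values:
//   $final_price = apply_filters( 'example_item_price', $price, $quantity );
```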
Report Finds Few Open Source Projects are Actively Maintained
"A recent analysis accounting for nearly 1.2 million open source software projects primarily across four major ecosystems found that only about 11% of projects were actively maintained," reports InfoWorld:In its 9th Annual State of the Software Supply Chain report, published October 3, software supply chain management company Sonatype assessed 1,176,407 projects and reported an 18% decline this year in actively maintained projects. Just 11% of projects - 118,028 - were receiving active maintenance. The report also found some new projects, unmaintained in 2022, now being maintained. The four ecosystems included JavaScript, via NPM; Java, via the Maven project management tool; Python, via the PyPI package index; and .NET, through the NuGet gallery. Some Go projects also were included. According to the report, 18.6% of Java and JavaScript projects that were being maintained in 2022 are no longer being maintained today. Other interesting findings:Nearly 10% reported security breaches due to open source vulnerabilities in the past 12 months.Use of AI and machine learning software components within corporate environments surged 135% over the last year.Read more of this story at Slashdot.
T2 Linux Discovers (Now Patched) AMD Zen 4 Invalid Opcode Speculation Bug
T2 SDE is not just a Linux distribution, but "a flexible Open Source System Development Environment or Distribution Build Kit," according to a 2022 announcement of its support for 25 CPU architectures, variants, and C libraries. ("Others might even name it Meta Distribution. T2 allows the creation of custom distributions with state of the art technology, up-to-date packages and integrated support for cross compilation.") And while working on it, Berlin-based T2 Linux developer Rene Rebe (long-time Slashdot reader ReneR) discovered random illegal instruction speculation on AMD Ryzen 7000-Series and Epyc Zen 4 CPUs. ReneR writes: Merged to Linux 6.6 Git is a fix for the bug now known at AMD as Erratum 1485. The discovery was possible through continued high CPU load cross-compiling the T2 Linux distribution with support for all CPU architectures from ARM, MIPS, PowerPC, RISC-V to x86 (and more) for 33 build variants. With sustained high CPU load and various instruction sequences being compiled, pseudo random illegal instruction errors were observed and subsequently analyzed. ExactCODE Research GmbH CTO Rene Rebe is thrilled that working with AMD engineers led to a timely mitigation to increase system stability of the still new and highest performance Zen4 platform. "I found real-world code that might be similar or actually trigger the same bugs in the CPU that are also used for all the Spectre Meltdown and other side-channel security vulnerability mitigations," Rebe says in a video announcement on YouTube. It took Rebe a tremendous amount of research, and he says now that "all the excessive work changed my mind. Mitigations equals considered harmful... If you want stable, reliable computational results - no, you can't do this. Because as Spectre Meltdown and all the other security issues have proven, the CPUs are nowadays as complex as complex software systems..."Read more of this story at Slashdot.
To 'Evolve' Windows Authentication, Microsoft Wants to Eventually Disable NTLM in Windows 11
An anonymous reader shared this report from Neowin: The various versions of Windows have used Kerberos as its main authentication protocol for over 20 years. However, in certain circumstances, the OS has to use another method, NTLM (NT LAN Manager). Today, Microsoft announced that it is expanding the use of Kerberos, with the plan to eventually ditch the use of NTLM altogether. In a blog post, Microsoft stated that NTLM continues to be used by some businesses and organizations for Windows authentication because it "doesn't require local network connection to a Domain Controller." It also is "the only protocol supported when using local accounts" and it "works when you don't know who the target server is." Microsoft states: These benefits have led to some applications and services hardcoding the use of NTLM instead of trying to use other, more modern authentication protocols like Kerberos. Kerberos provides better security guarantees and is more extensible than NTLM, which is why it is now a preferred default protocol in Windows. The problem is that while businesses can turn off NTLM for authentication, those hardwired apps and services could experience issues. That's why Microsoft has added two new authentication features to Kerberos. Microsoft's blog post calls it "the evolution of Windows authentication," arguing that "As Windows evolves to meet the needs of our ever-changing world, the way we protect users must also evolve to address modern security challenges..." So, "our team is building new features for Windows 11":
- Initial and Pass Through Authentication Using Kerberos, or IAKerb, "a public extension to the industry standard Kerberos protocol that allows a client without line-of-sight to a Domain Controller to authenticate through a server that does have line-of-sight."
- A local Key Distribution Center (KDC) for Kerberos, "built on top of the local machine's Security Account Manager so remote authentication of local user accounts can be done using Kerberos."
- "We are also fixing hard-coded instances of NTLM built into existing Windows components... shifting these components to use the Negotiate protocol so that Kerberos can be used instead of NTLM... NTLM will continue to be available as a fallback to maintain existing compatibility."
- "We are also introducing improved NTLM auditing and management functionality to give your organization more insight into your NTLM usage and better control for removing it."
- "Reducing the use of NTLM will ultimately culminate in it being disabled in Windows 11. We are taking a data-driven approach and monitoring reductions in NTLM usage to determine when it will be safe to disable."
Read more of this story at Slashdot.
GNU's 40th Anniversary: the FSF's Meeting with Old and New Friends
Devin Ulibarri, the Free Software Foundation's outreach and communications coordinator, writes up an event he describes as a meeting with some old and new friends: On Sunday, October 1, the Free Software Foundation (FSF) hosted a hackday to celebrate the fortieth anniversary of the GNU Project. Folks came from both near and far to join in the festivities at FSF headquarters, Boston, MA... Sadi moma bela loza, the Bulgarian melody to which The Free Software Song is set, could be heard faintly playing in a nearby room, its distinctive odd-metered tune performed by a fully-liberated X200... All in all, the event succeeded in our goal of welcoming both long-time members as well as introducing new people to free software and our cause. A few college students from local universities, for example, were able to ask questions seeking to better understand free software licenses and GNU Project history. We received multiple requests from attendees to host similar events again in the near future. And one parent, whose son played NetHack at the event, reported that, the following morning, his son asked to go to the FSF office after school to play it again. While playing, he mastered the "vi" movement keys immediately. We hope they serve him well...! Happy hacking and please stay tuned for more FSF-hosted events, including LibrePlanet 2024!Read more of this story at Slashdot.
Climate-Driven Heat Extremes May Make Earth Too Hot for Billions of Humans
An anonymous reader shared this report from Phys.org: If global temperatures increase by 1 degree Celsius (C) or more above current levels, each year billions of people will be exposed to heat and humidity so extreme they will be unable to naturally cool themselves, according to interdisciplinary research from the Penn State College of Health and Human Development, Purdue University College of Sciences and Purdue Institute for a Sustainable Future... Humans can only withstand certain combinations of heat and humidity before their bodies begin to experience heat-related health problems, such as heat stroke or heart attack. As climate change pushes temperatures higher around the world, billions of people could be pushed beyond these limits... Results of the study indicate that if global temperatures increase by 2 degrees C above pre-industrial levels, the 2.2 billion residents of Pakistan and India's Indus River Valley, the one billion people living in eastern China and the 800 million residents of sub-Saharan Africa will annually experience many hours of heat that surpass human tolerance... Troublingly, researchers said, these regions are also in lower-to-middle income nations, so many of the affected people may not have access to air conditioning or any effective way to mitigate the negative health effects of the heat.Read more of this story at Slashdot.
C# Challenges Java in Programming Language Popularity
"The gap between C# and Java never has been so small," according to October's update for TIOBE's "Programming Community Index". "Currently, the difference is only 1.2%, and if the trends remain this way, C# will surpass Java in about 2 month's time."Java shows the largest decline of -3.92% and C# the largest gain of +3.29% of all programming languages (annually). The two languages have always been used in similar domains and thus have been competitors for more than 2 decades now. Java's decline in popularity is mainly caused by Oracle's decision to introduce a paid license model after Java 8. Microsoft took the opposite approach with C#. In the past, C# could only be used as part of commercial tool Visual Studio. Nowadays, C# is free and open source and it's embraced by many developers. There are also other reasons for Java's decline. First of all, the Java language definition has not changed much the past few years and Kotlin, its fully compatible direct competitor, is easier to use and free of charge. "Java remains a critical language in enterprise computing," argues InfoWorld, "with Java 21 just released last month and Java 22 due next March. And free open source binaries of Java still are available via OpenJDK." InfoWorld also notes TIOBE's ranking is different than other indexes. TIOBE's top 10:Python (14.82%)C (12.08%)C++ (10.67%)Java (8.92%)C# (7.71%)JavaScript (2.91%)Visual Basic (2.13%)PHP (1.9%)SQL (1.78%)Assembly (1.64%)And here's the Pypl Popularity of Programming Language (based on searches for language tutorials on Google):Python, with a 28.05% shareJava (15.88%)JavaScript (9.27%)C# (6.79%)C/C++ (6.59%)PHP (4.86%)R (4.45%)TypeScript (2.93%)Swift (2.69%)Objective-C (2.29%)Read more of this story at Slashdot.
Is Glass the Future of Storage?
"If we carry on the way we're going, we're going to have to concrete the whole planet just to store the data that we're generating," explains a deputy lab director at Microsoft Research Cambridge in a new video. Fortunately, "A small sheet of glass can now hold several terabytes of data, enough to store approximately 1.75 million songs or 13 years' worth of music," explains a Microsoft Research web page about "Project Silica". (Data is retrieved by a high-speed, computer-controlled microscope from a library of glass disks storing data in three-dimensional pixels called voxels):Magnetic storage, although prevalent, is problematic. Its limited lifespan necessitates frequent re-copying, increasing energy consumption and operational costs over time. "Magnetic technology has a finite lifetime," says Ant Rowstron, Distinguished Engineer, Project Silica. "You must keep copying it over to new generations of media. A hard disk drive might last five years. A tape, well, if you're brave, it might last ten years. But once that lifetime is up, you've got to copy it over. And that, frankly, is both difficult and tremendously unsustainable if you think of all that energy and resource we're using." Project Silica aims to break this cycle. Developed under the aegis of Microsoft Research, it can store massive amounts of data in glass plates roughly the size of a drink coaster and preserve the data for thousands of years. Richard Black, Research Director, Project Silica, adds, "This technology allows us to write data knowing it will remain unchanged and secure, which is a significant step forward in sustainable data storage." Project Silica's goal is to write data in a piece of glass and store it on a shelf until it is needed. Once written, the data inside the glass is impossible to change. Project Silica is focused on pioneering data storage in quartz glass in partnership with the Microsoft Azure team, seeking more sustainable ways to archive data. This relationship is symbiotic, as Project Silica uses Azure AI to decode data stored in glass, making reading and writing faster and allowing more data storage... The library is passive, with no electricity in any of the storage units. The complexity is within the robots that charge as they idle inside the lab, awakening when data is needed...Initially, the laser writing process was inefficient, but after years of refinement, the team can now store several TB in a single glass plate that could last 10,000 years. For a sense of scale, each plate could store around 3,500 movies. Or enough non-stop movies to play for over half a year without repeating. A glass plate could hold the entire text of War and Peace - one of the longest novels ever written - about 875,000 times. And most importantly, it can store data in a fraction of the space of a datacenter... Thanks to long-time Slashdot reader Kirschey for sharing the article.Read more of this story at Slashdot.
How a Series of Air Traffic Control Lapses Nearly Killed 131 People
Due to an air traffic control mistake, a FedEx cargo plane flew within 100 feet of a Southwest Airlines flight in February. The New York Times reports that the flight's 128 passengers "were unaware that they had nearly died." In a year filled with close calls involving US airlines, this was the one that most unnerved federal aviation officials: A disaster had barely been averted, and multiple layers of the vaunted US air-safety system had failed... But the errors by the controller - who has continued to direct some plane traffic in Austin, Texas - were far from the whole story, according to 10 current and former controllers there, as well as internal Federal Aviation Administration documents reviewed by the Times. Austin-Bergstrom, like the vast majority of US airports, lacks technology that allows controllers to track planes on the ground and that warns of imminent collisions. The result is that on foggy days, controllers can't always see what is happening on runways and taxiways. Some have even resorted to using a public flight-tracking website in lieu of radar. In addition, for years Austin has had a shortage of experienced controllers, even as traffic at the airport has surged to record levels. Nearly three-quarters of shifts have been understaffed. Managers and rank-and-file controllers have repeatedly warned that staffing levels pose a public danger. The controller on that February morning was working an overtime shift. In June, Stephen B. Martin, then Austin's top manager, and a local union representative wrote a memo pleading for more controllers. "Drastic steps are needed to allow the facility to adequately staff for existing traffic," they wrote to FAA and union officials. Austin is a microcosm of a systemic crisis. The safety net that underpins air travel in America is fraying, exposing passengers to potential tragedies like the episode in February. And yet the chair of America's National Transportation Safety Board calls the February incident "just one of seven serious close calls and near misses involving commercial airlines that we have initiated investigations on this year." Thanks to long-time Slashdot reader schwit1 for sharing the article.Read more of this story at Slashdot.
First 'Doctor Who' Writer Honored. His Son Contests BBC's Rights to 'Unearthly Child'
The BBC reports: Doctor Who's first writer could finally be recognised 60 years after he helped launch the hugely-popular series. Anthony Coburn penned the first four episodes of the sci-fi drama in 1963 - a story called An Unearthly Child. But after his second story did not air, the writer has been seen as a minor figure among some Doctor Who fans. However, a campaign to erect a memorial to Coburn in his home town of Herne Bay, Kent, is gathering pace a month ahead of the show's 60th anniversary. A local elected councillor told the BBC they're working to find a location for the memorial. The BBC writes that Coburn's episode - broadcast November 23, 1963 - "introduced the character of The Doctor, his three travelling companions, and his time and space machine, the TARDIS, stuck in the form of a British police box." Richard Bignell, a Doctor Who historian, believes Coburn played a significant role in sowing the seeds of the programme's success. He said: "Although the major elements that would go on to form the core of the series were devised within the BBC, as the scriptwriter for the first story, Coburn was the one who really put the flesh on the bones of the idea and how it would work dramatically. "Many opening episodes of a new television series can be very clunky as they attempt to land their audience with too much information about the characters, the setting and what's going to happen, but Coburn was very reserved in how much he revealed, preserving all the wonder and mystery." In 2013, the Independent reported: Mr Coburn's son claims that the BBC has been in breach of copyright since his father's death in 1977. He has demanded that the corporation either stop using the Tardis in the show or pay his family for its every use since then. Stef Coburn claims that upon his father's death, any informal permission his father gave the BBC to use his work expired and the copyright of all of his ideas passed to his widow, Joan. Earlier this year she passed it on to him. He said: "It is by no means my wish to deprive legions of Doctor Who fans (of whom I was never one) of any aspect of their favourite children's programme. The only ends I wish to accomplish, by whatever lawful means present themselves, involve bringing about the public recognition that should by rights always have been his due, of my father James Anthony Coburn's seminal contribution to Doctor Who, and proper lawful recompense to his surviving estate." Today jd (Slashdot reader #1,658) notes that Stef Coburn apparently has a Twitter feed, where this week Stef claimed he'd cancelled the BBC's license to distribute his father's episodes after being offered what he complained was "a pittance" to relicense them. In response to someone who asked "What do you actually gain from doing this though?" Stef Coburn replied: "Vengeance." But elsewhere Stef Coburn writes "There are OTHER as yet unfulfilled projects & aspirations of Tony's (of one of which, I was a significant part, in his final year), which I would like to see brought to fruition. If Doctor Who is my ONLY available leverage. So be it!" Stef Coburn also announced plans to publish his father's "precursor draft-scripts (At least one very different backstory; sans 'Timelords') plus accompanying notes, for the story that became 'The Tribe of Gum'."Read more of this story at Slashdot.
Australian Student Invents Affordable Electric Car Conversion Kit
"Australian design student Alexander Burton has developed a prototype kit for cheaply converting petrol or diesel cars to hybrid electric," reports Dezeen magazine, "winning the country's national James Dyson Award in the process."Titled REVR (Rapid Electric Vehicle Retrofits), the kit is meant to provide a cheaper, easier alternative to current electric car conversion services, which Burton estimates cost AU$50,000 (26,400) on average and so are often reserved for valuable, classic vehicles. Usually, the process would involve removing the internal combustion engine and all its associated hardware, like the gearbox and hydraulic brakes, to replace them with batteries and electric motors. With REVR, those components are left untouched. Instead, a flat, compact, power-dense axial flux motor would be mounted between the car's rear wheels and disc brakes, and a battery and controller system placed in the spare wheel well or boot. Some additional off-the-shelf systems - brake and steering boosters, as well as e-heating and air conditioning - would also be added under the hood. By taking this approach, Burton believes he'll be able to offer the product for around AU$5,000 (2,640) and make it compatible with virtually any car... With REVR, people should be able to get several more years of life out of their existing cars. The kit would transform the vehicle into a hybrid rather than a fully electric vehicle, with a small battery giving the car 100 kilometres of electric range before the driver has to switch to the internal combustion engine... Borrowing a trick from existing hybrid vehicles, the kit uses a sensor to detect the position of the accelerator pedal to control both acceleration and braking. That means no changes have to be made to the car's hydraulic braking system, which Burton says "you don't want to have to interrupt". Thanks to Slashdot reader FrankOVD for sharing the news.Read more of this story at Slashdot.
Musicians Are Angry About Venues Taking T-shirt Money
The singer known as Tomberlin says their first five years in the music industry may have been a net loss, according to MarketWatch. Selling "merch[andise]" like t-shirts "is what really is covering your costs and hopefully helping you make, like, an actual profit." And then...After being told she would have to hand over more than 40% of the money she collected from selling T-shirts and other items, Tomberlin refused to sell her merchandise at the venue and publicly spoke about a practice she calls robbery - venues taking cuts from bands' merchandise sales... Other musicians are also speaking out about the practice, and their complaints seem to be having an effect. Industry giant Live Nation Entertainment Inc. announced recently that it would stop collecting merch fees at nearly 80 of the smaller clubs it owns and operates and provide all bands that play at those venues with an additional $1,500 in gas cards and cash. Musicians who spoke with MarketWatch remain unsatisfied, however. Because of the way the announcement is phrased, many think merch fees at Live Nation clubs are only being paused until the end of the year. The musicians said they also wonder about the roughly 250 other Live Nation concert facilities, as well as the hundreds of venues owned by other companies. A Live Nation spokesperson told MarketWatch the change is "open-ended." [...] As Tomberlin continues on her current tour, she wonders if she will be able to make a profitable career in music. Of all her ways of earning money, streaming services like Spotify and Apple Music provide "the least amount of money," she said, and with tours not leaving her with any cash at the end, she feels that even modest ambitions are out of reach. Musician Laura Jane Grace is even soliciting signers for an online petition demanding venues stop taking cuts of the musicians' merchandise sales... Thanks to Slashdot reader quonset for sharing the news.Read more of this story at Slashdot.
Startup Aims to Build Hundreds of Chip Factories with Prefab Parts and AI
"To meet the world's growing hunger for chips, a startup wants to upend the costly semiconductor fabrication plant with a nimbler, cheaper idea..." reports Fast Company, "an AI-enabled chip factory that can be assembled and expanded modularly with prefab pieces, like high-tech Lego bricks." In other words, they want to enable what is literally a fast company..."We're democratizing the ownership of semiconductor fabs," says Matthew Putman, referring to chip fabrication plants. Putman is the founder and CEO of Nanotronics, a New York City-based industrial AI company that deploys advanced optical solutions for detecting defects in manufacturing procedures. Its new system, called Cubefabs, combines its modular inspection tools and other equipment with AI, allowing the proposed chip factories to monitor themselves and adapt accordingly - part of what Putman calls an "autonomous factory." The bulk of the facility can be preassembled, flat-packed and put in shipping containers so that the facilities can be built "in 80% of the world," says Putman. Eventually, the company envisions hundreds of the flower-shaped fabs around the world, starting with a prototype in New York or Kuwait that it hopes to start building by the end of the year... Nanotronics says a single Cubefab installation could start at one acre with a single fab, and grow to a four-fab, six-acre footprint. Each fab could be built in under a year, the company says, with a four-fab installation estimated to cost under $100 million. Nanotronics declined to disclose how much it has raised for the project, but Putman says the company has previously raised $170 million from investors, including Peter Thiel and Jann Tallin, the Skype cofounder... A single automated Cubefab will need only about 30 people to operate, "and they don't have to be semiconductor experts," says Putman. "AI takes away that need for that specialization that you would normally need in a fab." [...] Putman also hopes automation will help further reduce the environmental impact of an industry that's notoriously resource-intensive and produces thousands of tons of waste a year, much of it hazardous. "Because you have the AI fixing the material and the device before it's manufactured, you have less waste of the final material," he says. Thanks to Slashdot reader tedlistens for sharing the news.Read more of this story at Slashdot.
Chinese Scientists Claim Record-Smashing Quantum Computing Breakthrough
From the South China Morning Post: Scientists in China say their latest quantum computer has solved an ultra-complicated mathematical problem within a millionth of a second - more than 20 billion years quicker than the world's fastest supercomputer could achieve the same task. The JiuZhang 3 prototype also smashed the record set by its predecessor in the series, with a one million-fold increase in calculation speed, according to a paper published on Tuesday by the peer-reviewed journal Physical Review Letters... The series uses photons - tiny particles that travel at the speed of light - as the physical medium for calculations, with each one carrying a qubit, the basic unit of quantum information... The fastest classical supercomputer Frontier - developed in the US and named the world's most powerful in mid-2022 - would take over 20 billion years to complete the same task, the researchers said. The article notes the photon count grew from 76 to 113 across the first two versions, and to 255 in the latest iteration. Thanks to long-time Slashdot reader hackingbear for sharing the news.Read more of this story at Slashdot.
US Antitrust Enforcer Continues Fighting Microsoft/Activision Deal, Calls it 'A Threat to Competition'
Yesterday America's Federal Trade Commission said it remained focused on its appeal opposing Microsoft's deal to buy Activision, reports Reuters. Reuters notes that Microsoft and Activision closed their transaction Friday "after winning approval from Britain on condition that they sell the streaming rights to Activision's games to Ubisoft Entertainment." But the U.S. Federal Trade Commission "has also fought the deal, and has an argument scheduled before an appeals court on December 6. The agency said on Friday that it remained focused on that appeal." An FTC spokesperson had this comment for Reuters: "The FTC continues to believe this deal is a threat to competition."Read more of this story at Slashdot.
Third-party Reddit App Narwhal Hopes To Survive Reddit's App Purge With Subscriptions
An anonymous reader shared this report from TechCrunch: After a nasty battle between the developers of third-party apps and Reddit management, ultimately resulting in a site-wide protest, many app makers were put out of business due to Reddit's price increases related to the usage of its API. Though the changes meant the loss of popular apps like Apollo, RIF (Reddit is Fun), ReddPlanet, Sync and BaconReader, one app, Narwhal, is attempting to make a comeback. The company announced this week that it will implement a subscription-based version of its app at $3.99 per month, promising an ad-free and privacy-focused experience. The new app will also include a Tip Jar to solicit donations to help keep the app afloat beyond the subscription fees and fund additional development work. Though the feature won't be available at launch, the app's developer Rick Harrison (u/det0ur on Reddit and CTO at Meadow by day) says he's considering adding a small fee, perhaps $1 per month, to allow users to also check their notifications and messages... Notes Narwhal's developer, Reddit's fee will be "tens of thousands if not hundreds of thousands a month depending on how many people subscribe." To work, the app will need a critical mass of subscribers to cover its costs, but Harrison says he's fairly confident the model will work. "Also, with a simpler plan like this, I can offer a subscription on a Narwhal website for 30% less (no Apple cut)," Harrison wrote... Narwhal isn't the only Reddit client to attempt to remain in business despite Reddit's API pricing changes. Another, Relay, announced a multi-tier subscription plan where users have to choose one of six price points, each of which caps them at a certain number of API calls.Read more of this story at Slashdot.
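To see why a "critical mass of subscribers" matters, here is an illustrative Python sketch of the per-subscriber economics. The API rate used is Reddit's widely reported figure of roughly $0.24 per 1,000 calls; the calls-per-user number is purely an assumption, and none of these are Narwhal's actual figures.

```python
# Illustrative per-subscriber math for a paid third-party Reddit client.
# Assumptions: ~10,000 API calls per subscriber per month (hypothetical) and
# Reddit's reported API price of about $0.24 per 1,000 calls.
api_price_per_1k_calls = 0.24
calls_per_subscriber = 10_000
api_cost = calls_per_subscriber / 1_000 * api_price_per_1k_calls   # $2.40

app_store_price = 3.99
net_after_apple_cut = app_store_price * (1 - 0.30)                 # ~$2.79 kept
direct_sale_price = round(net_after_apple_cut, 2)                  # the "30% less" web price

print(f"API cost per subscriber: ${api_cost:.2f}")
print(f"Net per App Store sub:   ${net_after_apple_cut:.2f}")
print(f"Equivalent direct price: ${direct_sale_price:.2f}")
# Under these assumptions the margin per subscriber is slim, which is why the
# developer leans on volume plus the Tip Jar to cover Reddit's monthly bill.
```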
Could The Next Big Solar Storm Fry the Grid?
Long-time Slashdot reader SonicSpike shared the Washington Post's speculation about the possibility of a gigantic solar storm leaving millions without phone or internet access, and requiring months or years of rebuilding: The odds are low that in any given year a storm big enough to cause effects this widespread will happen. And the severity of those impacts will depend on many factors, including the state of our planet's magnetic field on that day. But it's a near certainty that some form of this catastrophe will happen someday, says Ian Cohen, a chief scientist who studies heliophysics at the Johns Hopkins Applied Physics Laboratory. Long-time Slashdot reader davidwr remains skeptical: "I've only heard of two major events in the last 1300 years, one estimated to be between A.D. 744 and A.D. 993, and the other being the Carrington Event in 1859." But efforts are being made to improve our readiness, reports the Washington Post: To get ahead of this threat, a loose federation of U.S. and international government agencies, and hundreds of scientists affiliated with those bodies, have begun working on how to make predictions about what our Sun might do. And a small but growing cadre of scientists argue that artificial intelligence will be an essential component of efforts to give us advance notice of such a storm... At present, no warning system is capable of giving us more than a few hours' notice of a devastating solar storm. If it's moving fast enough, it could be as little as 15 minutes. The most useful sentinel - a sun-orbiting satellite launched by the U.S. in 2015 - is much closer to Earth than the sun, so that by the time a fast-moving storm crosses its path, an hour or less is all the warning we get. The European Space Agency has proposed a system to help give earlier warning by putting a satellite dubbed Vigil into orbit around the Sun, positioned roughly the same distance from the Earth as the Earth is from the Sun. It could potentially give us up to five hours of warning about an incoming solar storm - enough time to do the main thing that can help preserve electronics: Switch them all off. But what if there were a way to predict this better, by analyzing the data we've got? That's the idea behind a new, AI-powered model recently unveiled by scientists at the Frontier Development Lab - a public-private partnership that includes NASA, the U.S. Geological Survey, and the U.S. Department of Energy. The model uses deep learning, a type of AI, to examine the flow of the solar wind, the usually calm stream of particles that flow outward from our sun and through the solar system to well beyond the orbit of Pluto. Using observations of that solar wind, the model can predict the "geomagnetic disturbance" an incoming solar storm observed by sun-orbiting satellites would cause at any given point on Earth, the researchers involved say. This model can predict just how big the flux of the Earth's magnetic field will be when the solar storm arrives, and thus how big the induced currents in power lines and undersea internet cables will be... Already, the first primitive ancestor of future AI-based solar-weather alert systems is live. The DstLive system, which debuted on the web in December 2022, uses machine learning to take data about the state of Earth's magnetic field and the solar wind and translate both into a single measure for the entire planet, known as DST. Think of it as the Richter scale, but for solar storms. This number is intended to give us an idea of how intense a storm's impact will be on Earth, an hour to six hours in advance. Unfortunately, we may not know how useful such systems are until we live through a major solar storm.Read more of this story at Slashdot.
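As a rough illustration of what such a system does (not the Frontier Development Lab's actual model or data), the Python sketch below trains a small neural network to map synthetic solar-wind measurements (wind speed, proton density, and the north-south magnetic field component Bz) to a Dst-like disturbance value:

```python
# Toy example only: a small neural net mapping synthetic solar-wind readings to a
# Dst-like index. A real system would train on satellite solar-wind data and
# measured Dst values; the "ground truth" below is a made-up formula plus noise.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
speed = rng.uniform(300, 900, n)      # solar wind speed, km/s
density = rng.uniform(1, 30, n)       # proton density, per cm^3
bz = rng.uniform(-25, 10, n)          # IMF Bz, nT (southward/negative is geoeffective)

# Synthetic target: storms deepen (more negative Dst) when Bz is southward and wind is fast.
dst = -0.005 * speed * np.clip(-bz, 0, None) - 0.5 * density + rng.normal(0, 5, n)

X = np.column_stack([speed, density, bz])
X_train, X_test, y_train, y_test = train_test_split(X, dst, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out synthetic data:", round(model.score(X_test, y_test), 3))
```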
FTX Thief Cashes Out Millions During Bankman-Fried Trial
An anonymous reader quotes a report from the BBC: A thief who stole more than $470 million in cryptocurrency when FTX crashed is trying to cash it out while the exchange's founder is on trial. Sam Bankman-Fried's high-profile court case began last week. The former crypto mogul denies fraud. After lying dormant for nine months, experts say $20 million of the stolen stash is being laundered into traditional money every day. New analysis shows how the mystery thief is trying to hide their tracks. [...] On the day FTX collapsed, hundreds of millions of dollars of cryptocurrency controlled by the exchange were stolen by an unidentified thief that is believed to still have control of the funds. No one knows how the thief -- or thieves -- was able to get digital keys to FTX crypto wallets, but it is thought it was either an insider or a hacker who was able to steal the information. The criminal moved 9,500 Ethereum coins, then worth $15.5 million, from a wallet belonging to FTX, to a new wallet. Over the next few hours, hundreds of other cryptoassets were taken from the company's wallets, in transactions eventually totaling $477 million. According to researchers from Elliptic, a cryptocurrency investigation firm, the thief lost more than $100 million in the weeks following the hack as some was frozen or lost in processing fees as they frantically moved the funds around to evade capture. But by December around $70 million was successfully sent to a cryptocurrency mixer -- a criminal service used to launder Bitcoin, making it difficult to trace. [...] Although mixers make it difficult to trace Bitcoin, Elliptic was able to follow a small amount of the funds -- $4 million -- that was sent to an exchange. The rest of the stolen FTX stash -- around $230 million -- remained untouched until 30 September -- the weekend before Mr Bankman-Fried's trial began. Nearly every day since then chunks worth millions have been sent to a mixer for laundering and then presumably cashing out. Elliptic has been able to trace $54 million of Bitcoin being sent to the Sinbad mixer after which the trail has gone cold for now. "Crypto launderers have been known to wait for years to move and cash out assets once public attention has dissipated, but in this case they have begun to move just as the world's attention is once again directed towards FTX and the events of November 2022," said Tom Robinson, Elliptic's co-founder.Read more of this story at Slashdot.
Audit Calls NASA's Goal To Reduce Artemis Rocket Costs 'Highly Unrealistic,' Threat To Deep Space Exploration
Richard Tribou reports via Phys.Org: NASA's goal to reduce the costs of the powerful Space Launch System rocket for its Artemis program by 50% was called "highly unrealistic" and a threat to its deep space exploration plans, according to a report by NASA's Office of the Inspector General released (PDF) on Thursday. The audit says the costs to produce one SLS rocket through its proposed fixed-cost contract will still top $2.5 billion, even though NASA thinks it can shrink that through "workforce reductions, manufacturing and contracting efficiencies, and expanding the SLS's user base." "Given the enormous costs of the Artemis campaign, failure to achieve substantial savings will significantly hinder the sustainability of NASA's deep space human exploration efforts," the report warns. The audit looked at NASA's plans to shift from its current setup among multiple suppliers for the hardware to a sole-sourced services contract that would include the production, systems integration and launch of at least five SLS flights beginning with Artemis V currently slated for as early as 2029. NASA's claim it could get those costs to $1.25 billion per rocket was taken to task by the audit. "NASA's aspirational goal to achieve a cost savings of 50% is highly unrealistic. Specifically, our review determined that cost saving initiatives in several SLS production contracts were not significant," the audit reads. It does find that rocket costs could approach $2 billion through the first 10 SLS rockets under the new contract, a reduction of 20%. [...] Through 2025, the audit says, NASA's Artemis missions will have topped $93 billion, which includes billions more than originally announced in 2012 as years of delays and cost increases plagued the leadup to Artemis I. The SLS rocket represents 26% of that cost, to the tune of $23.8 billion. The inspector general makes several recommendations to NASA, the most striking of which is that NASA consider using commercial heavy-lift rockets, such as SpaceX's Starship and Super Heavy or Blue Origin's New Glenn, as an alternative to the SLS rocket for future Artemis missions. "Although the SLS is the only launch vehicle currently available that meets Artemis mission needs, in the next 3 to 5 years other human-rated commercial alternatives that are lighter, cheaper, and reusable may become available," the audit reads. "Therefore, NASA may want to consider whether other commercial options should be a part of its mid- to long-term plans to support its ambitious space exploration goals."Read more of this story at Slashdot.
Hydro Dams Are Struggling To Handle the World's Intensifying Weather
Saqib Rahim reports via Wired: It's been one of the wettest years in California since records began. From October 2022 to March 2023, the state was blasted by 31 atmospheric rivers -- colossal bands of water vapor that form above the Pacific and become firehoses when they reach the West Coast. What surprised climate scientists wasn't the number of storms, but their strength and rat-a-tat frequency. The downpours shocked a water system that had just experienced the driest three years in recorded state history, causing floods, mass evacuations, and at least 22 deaths. Swinging between wet and dry extremes is typical for California, but last winter's rain, potentially intensified by climate change, was almost unmanageable. Add to that the arrival of El Nino, and more extreme weather looks likely for the state. This is going to make life very difficult for the dam operators tasked with capturing and controlling much of the state's water. Like most of the world's 58,700 large dams, those in California were built for yesterday's more stable climate patterns. But as climate change taxes the world's water systems -- affecting rainfall, snowmelt, and evaporation -- it's getting tough to predict how much water gets to a dam, and when. Dams are increasingly either water-starved, unable to maintain supplies of power and water for their communities, or overwhelmed and forced to release more water than desired -- risking flooding downstream. But at one major dam in Northern California, operators have been demonstrating how to not just weather these erratic and intense storms, but capitalize on them. Management crews at New Bullards Bar, built in 1970, entered last winter armed with new forecasting tools that gave unprecedented insight into the size and strength of the coming storms -- allowing them to strategize how to handle the rain. First, they let the rains refill their reservoir, a typical move after a long drought. Then, as more storms formed at sea, they made the tough choice to release some of this precious hoard through their hydropower turbines, confident that more rain was coming. "I felt a little nervous at first," says John James, director of resource planning at Yuba Water Agency in northern California. Fresh showers soon validated the move. New Bullards Bar ended winter with plumped water supplies, a 150 percent boost in power generation, and a clean safety record. The strategy offers a glimpse of how better forecasting can allow hydropower to adapt to the climate age.Read more of this story at Slashdot.
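The decision New Bullards Bar's operators made is the core of what's known as forecast-informed reservoir operations: release water ahead of a storm only when the forecast says the reservoir would otherwise encroach on its flood-control space. Below is a simplified Python sketch of that rule, with entirely hypothetical numbers rather than Yuba Water Agency's actual operating curves.

```python
# Simplified forecast-informed release rule. All figures are hypothetical.
def pre_release(storage_af, capacity_af, forecast_inflow_af, flood_pool_fraction=0.9):
    """Acre-feet to release before a storm so the forecast inflow still fits
    below the flood-control pool (a fixed fraction of capacity here)."""
    flood_pool = flood_pool_fraction * capacity_af
    projected_storage = storage_af + forecast_inflow_af
    return max(0.0, projected_storage - flood_pool)   # release only the expected excess

# A nearly full reservoir with a confident forecast of a large incoming storm:
release = pre_release(storage_af=800_000, capacity_af=1_000_000, forecast_inflow_af=250_000)
print(f"Pre-release {release:,.0f} acre-feet through the turbines ahead of the storm")
# -> 150,000 acre-feet: water that generates power now rather than being spilled later.
```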
Biden Awards $7 Billion For 7 Hydrogen Hubs In Climate Fight Plan
An anonymous reader quotes a report from Reuters: U.S. President Joe Biden traveled to Philadelphia on Friday to announce the recipients of $7 billion in federal grants across 16 states for the development of seven regional hydrogen hubs, advancing a key part of a plan to decarbonize the U.S. economy. The announcement of the funding to boost manufacturing and blue-collar jobs was held in Pennsylvania -- a state that could decide the 2024 presidential election -- underscoring the power Biden wields as he spends the upcoming months doling out money flowing from his landmark pieces of legislation that remain largely unknown to large swaths of the American public. The seven proposed hubs involving companies ranging from Exxon Mobil to Amazon were selected, with their projects spanning 16 states from Pennsylvania to California. The program is intended to jump-start the production of "clean hydrogen" along with the infrastructure needed to get it to industrial users like steelmakers and cement plants. "I'm here to announce one of the largest advanced manufacturing investments in the history of this nation," Biden said. He noted that the total investment will reach $50 billion when taking into account additional investments from private companies. The hub selections will now kick off a long process that includes multiple phases, from design and development to permitting, financing and construction. "It's not guaranteed that someone selected is even going to make it through negotiations and get awarded the money," said Jason Munster, who was involved in analyzing the projects for the Department of Energy and is now a hydrogen consultant at CleanEpic. The hubs selected will serve the Middle Atlantic, Appalachian, Midwest, Minnesota and Plains states, the Gulf Coast, Pacific Northwest and California. The two largest projects include $1.2 billion each for Texas and California -- the former an oil giant and the other a green energy leader.Read more of this story at Slashdot.
Mathematician Warns US Spies May Be Weakening Next-Gen Encryption
Matthew Sparkes reports via NewScientist: A prominent cryptography expert has told New Scientist that a US spy agency could be weakening a new generation of algorithms designed to protect against hackers equipped with quantum computers. Daniel Bernstein at the University of Illinois Chicago says that the US National Institute of Standards and Technology (NIST) is deliberately obscuring the level of involvement the US National Security Agency (NSA) has in developing new encryption standards for "post-quantum cryptography" (PQC). He also believes that NIST has made errors -- either accidental or deliberate -- in calculations describing the security of the new standards. NIST denies the claims. Bernstein alleges that NIST's calculations for one of the upcoming PQC standards, Kyber512, are "glaringly wrong," making it appear more secure than it really is. He says that NIST multiplied two numbers together when it would have been more correct to add them, resulting in an artificially high assessment of Kyber512's robustness to attack. "We disagree with his analysis," says Dustin Moody at NIST. "It's a question for which there isn't scientific certainty and intelligent people can have different views. We respect Dan's opinion, but don't agree with what he says." Moody says that Kyber512 meets NIST's "level one" security criteria, which makes it at least as hard to break as a commonly used existing algorithm, AES-128. That said, NIST recommends that, in practice, people should use a stronger version, Kyber768, which Moody says was a suggestion from the algorithm's developers. NIST is currently in a period of public consultation and hopes to reveal the final standards for PQC algorithms next year so that organizations can begin to adopt them. The Kyber algorithm seems likely to make the cut as it has already progressed through several layers of selection. Given its secretive nature, it is difficult to say for sure whether or not the NSA has influenced the PQC standards, but there have long been suggestions and rumors that the agency deliberately weakens encryption algorithms. In 2013, The New York Times reported that the agency had a budget of $250 million for the task, and intelligence agency documents leaked by Edward Snowden in the same year contained references to the NSA deliberately placing a backdoor in a cryptography algorithm, although that algorithm was later dropped from official standards.Read more of this story at Slashdot.
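The multiply-versus-add dispute is easier to see with exponents. Attack costs are quoted as powers of two, so multiplying two cost components adds their exponents, while adding the components barely moves the larger one. The Python sketch below uses hypothetical exponents, not NIST's actual Kyber512 figures, purely to show the size of the gap such a choice can create:

```python
# Hypothetical illustration of combining two attack-cost components.
# Security estimates are powers of two: 2^a * 2^b = 2^(a+b), but 2^a + 2^b ~ 2^max(a, b).
import math

compute_bits = 140   # hypothetical: 2^140 cost for the computational part of an attack
memory_bits = 35     # hypothetical: 2^35 cost attributed to memory access

multiplied_bits = compute_bits + memory_bits                      # 2^175
added_bits = math.log2(2.0**compute_bits + 2.0**memory_bits)      # ~2^140

print(f"multiplying the components: ~2^{multiplied_bits}")
print(f"adding the components:      ~2^{added_bits:.1f}")
# The two conventions differ by tens of bits of claimed security margin, which is
# the scale of discrepancy at issue in the Kyber512 disagreement described above.
```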
Netflix To Open Branded Retail Stores For Some Reason
As reported by Bloomberg, Netflix plans to open a number of brick-and-mortar retail locations, called Netflix House, in 2025. Engadget reports: The stores will sell merchandise based on hit Netflix shows, so you can finally snag that Lincoln Lawyer coffee mug you've always dreamed of. Netflix House establishments will also offer dining and curated live experiences. To the latter point, the two initial locations are going to feature an obstacle course based on Squid Game. This seems to miss the point of the show's brutal satire of modern capitalism, but that's been par for the course since it took the world by storm back in 2021. Netflix House will also boast rotating art installations based on hit shows and live performances to excite fans. Additionally, the in-house restaurant will serve cuisine and drinks originally featured on the streamer's many unscripted food-based reality shows. The menu will range from fast casual to high-end dining. The first two locations should open up in the US some time in 2025, though Netflix hasn't said where, with more global outlets to come at a later date. Why the big global push? Josh Simon, the company's vice president of consumer products, told Bloomberg that its customers "love to immerse themselves in the world of our movies and TV shows, and we've been thinking a lot about how we take that to the next level." [...] The company's still finalizing details regarding menus, locations and just about everything else. It has more than a year, after all, to set up shop.Read more of this story at Slashdot.
Across US, Chinese Bitcoin Mines Draw National Security Scrutiny
According to the New York Times, Chinese-owned bitcoin mining operations in the United States are raising security concerns because of their proximity to sensitive sites and the potential for cyber threats. The Crypto Times reports: Some mining facilities sit close to critical infrastructure, such as a Microsoft data center that supports the Pentagon and an Air Force base in Wyoming that controls nuclear missiles, and U.S. officials fear Chinese espionage activity at these locations. Many of these mining operations were set up after China banned bitcoin mining in 2021. Their owners sometimes maintain connections to the Chinese Communist Party or state-owned companies, ties that may be concealed behind multiple layers of companies. Texas has become a haven for Chinese-linked bitcoin mining: while some US states impose restrictions, Texas offers incentives. The operations could also pose a threat to the power grid and other essential infrastructure, and a recent report raised the further concern of a potential Chinese cyber strike on US infrastructure in the event of a major conflict.Read more of this story at Slashdot.