Google is getting ready to test a new "IP Protection" feature for the Chrome browser that enhances users' privacy by masking their IP addresses using proxy servers. From a report: Recognizing the potential misuse of IP addresses for covert tracking, Google seeks to strike a balance between ensuring users' privacy and the essential functionalities of the web. IP addresses allow websites and online services to track activities across websites, thereby facilitating the creation of persistent user profiles. This poses significant privacy concerns as, unlike third-party cookies, users currently lack a direct way to evade such covert tracking. While IP addresses are potential vectors for tracking, they are also indispensable for critical web functionalities like routing traffic, fraud prevention, and other vital network tasks. The "IP Protection" solution addresses this dual role by routing third-party traffic from specific domains through proxies, making users' IP addresses invisible to those domains. As the ecosystem evolves, so will IP Protection, adapting to continue safeguarding users from cross-site tracking and adding additional domains to the proxied traffic. "Chrome is reintroducing a proposal to protect users against cross-site tracking via IP addresses. This proposal is a privacy proxy that anonymizes IP addresses for qualifying traffic as described above," reads a description of the IP Protection feature. Initially, IP Protection will be an opt-in feature, ensuring users have control over their privacy and letting Google monitor behavior trends.Read more of this story at Slashdot.
Matter -- the IoT connectivity standard with ambitions to fix the smart home and make all of our gadgets talk to each other -- has hit version 1.2, adding support for nine new types of connected devices. From a report: Robot vacuums, refrigerators, washing machines, and dishwashers are coming to Matter, as are smoke and CO alarms, air quality sensors, air purifiers, room air conditioners, and fans. It's a crucial moment for the success of the industry-backed coalition that counts 675 companies among its members. This is where it moves from the relatively small categories of door locks and light bulbs to the real moneymakers: large appliances. The Connectivity Standards Alliance (CSA), the organization behind Matter, released the Matter 1.2 specification this week, a year after launching Matter 1.0, following through on its promise to release two updates a year. Now, appliance manufacturers can add support for Matter to their devices, and ecosystems such as Apple Home, Amazon Alexa, Google Home, and Samsung SmartThings can start supporting the new device types. Yes, this means you should finally be able to control a robot vacuum in the Apple Home app -- not to mention your wine fridge, dishwasher, and washing machine. The initial feature set for the new device types includes basic function controls (start / stop, change mode) and notifications -- such as the temperature of your fridge, the status of your laundry, or whether smoke is detected. Robot vacuum support is robust -- remote start and progress notifications, cleaning modes (dry vacuum, wet mopping), and alerts for brush status, error reporting, and charging status. But there's no mapping, so you'll still need to use your vacuum app if you want to tell the robot where to go. Read more of this story at Slashdot.
An anonymous reader shares a report: In 2020, scientists decided just to rework the alphanumeric symbols they used to represent genes rather than try to deal with an Excel feature that was interpreting their names as dates and (un)helpfully reformatting them automatically. Last week, a member of the Excel team posted that the company is rolling out an update on Windows and macOS to fix that. Excel's automatic conversions are intended to make it easier and faster to input certain types of commonly entered data -- numbers and dates, for instance. But for scientists using quick shorthand to make things legible, it could ruin published, peer-reviewed data, as a 2016 study found. Microsoft detailed the update in a blog post last week, adding a checkbox labeled "Convert continuous letters and numbers to a date." You can probably guess what that toggles. The update builds on the Automatic Data Conversions settings the company added last year, which included the option for Excel to warn you when it's about to get extra helpful and let you load your file without automatic conversion so you can ensure nothing will be screwed up by it.Read more of this story at Slashdot.
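Excel's new checkbox is a UI fix, but the underlying failure mode - symbols like SEPT2 or MARCH1 silently becoming dates - is also easy to guard against in code. Here is a minimal Python sketch of the kind of defensive loading many bioinformaticians already use while waiting for spreadsheet fixes; genes.csv is a hypothetical input file, not one from the article:

```python
import pandas as pd

# Hypothetical input: a CSV whose first column holds gene symbols such as
# SEPT2, MARCH1, or DEC1 -- names that spreadsheet software has historically
# reinterpreted as dates (e.g. "SEPT2" -> September 2).
df = pd.read_csv(
    "genes.csv",
    dtype=str,              # read every column as plain text, never as dates or numbers
    keep_default_na=False,  # don't turn a literal gene symbol like "NA" into a missing value
)

# Writing back out as CSV keeps the symbols as literal strings; the damage only
# happens if the file is later re-opened and re-saved in a spreadsheet with
# automatic conversion still enabled.
df.to_csv("genes_clean.csv", index=False)
```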
An anonymous reader shares a report: Apple was caught flat-footed when ChatGPT and other AI tools took the technology industry by storm. But the company is now preparing its response and plans to develop features for its full range of devices. One of the most intense and widespread endeavors at Apple right now is its effort to respond to the AI frenzy sweeping the technology industry. The company has some catching up to do. Apple largely sat on the sidelines when OpenAI's ChatGPT took off like a rocket last year. It watched as Google and Microsoft rolled out generative AI versions of their search engines, which spit out convincingly human-like responses to users' queries. Microsoft also updated its Windows apps with smarter assistants, and Amazon unveiled an AI-enhanced overhaul of Alexa. All the while, the only noteworthy AI release from Apple was an improved auto-correct system in iOS 17. Apple's senior vice presidents in charge of AI and software engineering, John Giannandrea and Craig Federighi, are spearheading the effort. On Cook's team, they're referred to as the "executive sponsors" of the generative AI push. Eddy Cue, the head of services, is also involved, I'm told. The trio are now on course to spend about $1 billion per year on the undertaking. Giannandrea is overseeing development of the underlying technology for a new AI system, and his team is revamping Siri in a way that will deeply implement it. This smarter version of Siri could be ready as soon as next year, but there are still concerns about the technology and it may take longer for Apple's AI features to spread across its product line. Federighi's software engineering group, meanwhile, is adding AI to the next version of iOS. There's an edict to fill it with features running on the company's large language model, or LLM, which uses a flood of data to hone AI capabilities. The new features should improve how both Siri and the Messages app can field questions and auto-complete sentences, mirroring recent changes to competing services.Read more of this story at Slashdot.
Martin Goetz, who joined the computer industry in its infancy in the mid-1950s as a programmer working on Univac mainframes and who later received the first U.S. patent for software, died on Oct. 10 at his home in Brighton, Mass. He was 93. The New York Times: His daughter Karen Jacobs said the cause was leukemia. In 1968, nearly a decade after he and several other partners started the company Applied Data Research, Mr. Goetz received his patent, for data-sorting software for mainframes. It was major news in the industry: An article in Computerworld magazine bore the headline "First Patent Is Issued for Software, Full Implications Are Not Known." Until then, software had not been viewed as a patentable product, one that was bundled into hulking mainframes like those made by IBM. Ms. Jacobs said her father had patented his own software so that IBM could not copy it and put it on its machines. "By 1968, I had been involved in arguing about the patentability of software for about three years," Mr. Goetz said in an oral history interview in 2002 for the University of Minnesota. "I knew at some point in time the patent office would recognize it." What Mr. Goetz called his "sorting system" is believed to have been the first software product to be sold commercially, and his success at securing a patent led him to become a vocal champion of patenting software. The programs that instruct computers on what to do, he said, were often as worthy of patents as the machines themselves. The issuance of Mr. Goetz's patent "helped managers, programmers and lawyers at young software firms feel as if they were forming an industry of their own -- one in which they were creating products that were potentially profitable and legally defensible as proprietary inventions," Gerardo Con Diaz, a professor of science and technology studies at the University of California, Davis, wrote in the 2019 book "Software Rights: How Patent Law Transformed Software Development." Further reading, from Slashdot archive: Recipient of First Software Patent Defends Them (2009).Read more of this story at Slashdot.
Japan's antitrust watchdog has begun an investigation into whether Alphabet's Google abuses its market position to block rival services, compounding scrutiny of the internet leader's business practices across the globe. From a report: The country's Fair Trade Commission has begun a probe centered on allegations of potential antitrust violations, an official with the agency said, confirming a Nikkei report. It plans to solicit information and views on the matter from the public, the official added. The agency plans to examine whether Google inappropriately asked smartphone makers to prioritize its search services on their devices. The Japanese investigation marked the first time the commission has consulted with third parties from the outset of an individual probe, agency officials told reporters in Tokyo. The probe could widen to include Android phone makers found to be complicit in antitrust activity, an official said, without elaborating. Japan's review comes on top of an antitrust case the US has mounted against the global search leader. Federal regulators accuse Google of abusing its dominance to block startups and larger rivals such as Microsoft, a key argument in the biggest tech anti-monopoly case since the 1990s.Read more of this story at Slashdot.
A Harvard professor on the history of science looks at our response to the pandemic, criticizing "a report that gave the false impression that masking didn't help." From Scientific American:The group's report was published by Cochrane, an organization that collects databases and periodically issues "systematic" reviews of scientific evidence relevant to health care. This year it published a paper addressing the efficacy of physical interventions to slow the spread of respiratory illness such as COVID... The review of studies of masking concluded that the "results were inconclusive..." [and] it was "uncertain whether wearing [surgical] masks or N95/P2 respirators helps to slow the spread of respiratory viruses." Still, the authors were also uncertain about that uncertainty, stating that their confidence in their conclusion was "low to moderate." You can see why the average person could be confused... The Cochrane finding was not that masking didn't work but that scientists lacked sufficient evidence of sufficient quality to conclude that they worked... Cochrane has made this mistake before. In 2016 a flurry of media reports declared that flossing your teeth was a waste of time... The answer demonstrates a third issue with the Cochrane approach: how it defines evidence. The organization states that its reviews "identify, appraise and synthesize all the empirical evidence that meets pre-specified eligibility criteria." The problem is what those eligibility criteria are. Cochrane Reviews base their findings on randomized controlled trials (RCTs), often called the "gold standard" of scientific evidence. But many questions can't be answered well with RCTs, and some can't be answered at all... In fact, there is strong evidence that masks do work to prevent the spread of respiratory illness. It just doesn't come from RCTs. It comes from Kansas. In July 2020 the governor of Kansas issued an executive order requiring masks in public places. Just a few weeks earlier, however, the legislature had passed a bill authorizing counties to opt out of any statewide provision. In the months that followed, COVID rates decreased in all 24 counties with mask mandates and continued to increase in 81 other counties that opted out of them... Cochrane ignored this epidemiological evidence because it didn't meet its rigid standard. I have called this approach "methodological fetishism," when scientists fixate on a preferred methodology and dismiss studies that don't follow it. Sadly, it's not unique to Cochrane. By dogmatically insisting on a particular definition of rigor, scientists in the past have landed on wrong answers more than once. Vox also points out that while Cochrane's review included 78 studies, "only six were actually conducted during the Covid-19 pandemic... Instead, most of them looked at flu transmission in normal conditions, and many of them were about other interventions like hand-washing. "Only two of the studies are about Covid and masking in particular. Furthermore, neither of those studies looked directly at whether people wear masks, but instead at whether people were encouraged or told to wear masks by researchers."Read more of this story at Slashdot.
"New data suggests that what a lot of people do most often in their car is listen to AM/FM radio," writes 9to5Mac. "Yes, it's 2023, and you might think AM/FM radio is on the way out, but new data show that to not be the case for a lot of people..." The market research company Edison Research used one-day listening diarires (for Americans older than 13) to measure the amount of time spent listening to audio - then compared results for those with and without an in-car entertainment system. Those without an in-car entertainment system spent 67% of their time listening to AM/FM radio - with the rest listening to Sirius XM (12%), a streaming service (9%), or podcasts (4%). But among those with an in-car entertainment system... 46% still listened to AM/FM radio. Less than a fifth listened to Sirus XM (19%), a streaming service (18%), or podcasts (7%). The researchers' conclusion? "Even those with these systems choose AM/FM for nearly half of their in-car listening. For many people, even with so many new options, radio and the in-car environment continue to just go together."Read more of this story at Slashdot.
One 80-year-old retired teacher in Los Angeles lost $69,000 in bitcoin to scammers. And 46,000 people lost over $1 billion to crypto scams since 2021 (according to America's Federal Trade Commission). Now the Los Angeles Times reports California's new moves against scammers using bitcoin ATMs, with a bill one representative says "is about ensuring that people who have been frauded in our communities don't continue to watch our state step aside when we know that these are real problems that are happening."Starting in January, California will limit cryptocurrency ATM transactions to $1,000 per day per person under Senate Bill 401, which Gov. Gavin Newsom signed into law. Some bitcoin ATM machines advertise limits as high as $50,000... Victims of bitcoin ATM scams say limiting the transactions will give people more time to figure out they're being tricked and prevent them from using large amounts of cash to buy cryptocurrency. But crypto ATM operators say the new laws will harm their industry and the small businesses they pay to rent space for the machines. There are more than 3,200 bitcoin ATMs in California, according to Coin ATM Radar, a site that tracks the machines' locations. "This bill fails to adequately address how to crack down on fraud, and instead takes a punitive path focused on a specific technology that will shudder the industry and hurt consumers, while doing nothing to stop bad actors," said Charles Belle, executive director of the Blockchain Advocacy Coalition... Law enforcement has cracked down on unlicensed crypto ATMs, but it can be tough for consumers to tell how serious the industry is about addressing the concerns. In 2020, a Yorba Linda man pleaded guilty to charges of operating unlicensed bitcoin ATMs and failing to maintain an anti-money-laundering program even though he knew criminals were using the funds. The illegal business, known as Herocoin, allowed people to buy and sell bitcoin in transactions of up to $25,000 and charged a fee of up to 25%. So there's also provisions in the law against exorbitant fees:The new law also bars bitcoin ATM operators from collecting fees higher than $5 or 15% of the transaction, whichever is greater, starting in 2025. Legislative staff members visited a crypto kiosk in Sacramento and found markups as high as 33% on some digital assets when they compared the prices at which cryptocurrency is bought and sold. Typically, a crypto ATM charges fees between 12% and 25% over the value of the digital asset, according to a legislative analysis... Another law would by July 2025 require digital financial asset businesses to obtain a license from the California Department of Financial Protection and Innovation.Read more of this story at Slashdot.
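To make the fee cap concrete: it is the greater of $5 or 15% of the transaction, so the flat $5 only matters on very small purchases. A short illustrative calculation using the figures reported above (this is just arithmetic on the reported terms, not language from the bill):

```python
# Illustrative arithmetic only; the figures come from the reported terms of SB 401.
DAILY_LIMIT = 1_000    # max crypto ATM purchase per person per day, starting in January
FLAT_FEE = 5.00        # flat component of the fee cap that takes effect in 2025
PERCENT_FEE = 0.15     # percentage component of that fee cap

def max_allowed_fee(transaction: float) -> float:
    """Largest fee an operator could charge: $5 or 15% of the transaction, whichever is greater."""
    return max(FLAT_FEE, PERCENT_FEE * transaction)

# At the $1,000 daily cap the allowed fee tops out at $150 (15%), versus the
# 12-25% markups (and 33% outliers) cited in the legislative analysis.
print(max_allowed_fee(DAILY_LIMIT))  # 150.0
print(max_allowed_fee(20.0))         # 5.0 -- the flat $5 floor dominates small buys
```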
"It's not every day that you get to update the firmware on a device that was produced in the 1970s," writes Hackaday, "and rarely is said device well beyond the boundaries of our solar system. "This is however exactly what the JPL team in charge of the Voyager 1 & 2 missions are facing, as they are in the process of sending fresh firmware patches over to these amazing feats of engineering." From NASA's announcement:One effort addresses fuel residue that seems to be accumulating inside narrow tubes in some of the thrusters on the spacecraft. The thrusters are used to keep each spacecraft's antenna pointed at Earth. This type of buildup has been observed in a handful of other spacecraft... In some of the propellant inlet tubes, the buildup is becoming significant. To slow that buildup, the mission has begun letting the two spacecraft rotate slightly farther in each direction [almost 1 degree] before firing the thrusters. This will reduce the frequency of thruster firings... While more rotating by the spacecraft could mean bits of science data are occasionally lost - akin to being on a phone call where the person on the other end cuts out occasionally - the team concluded the plan will enable the Voyagers to return more data over time. Engineers can't know for sure when the thruster propellant inlet tubes will become completely clogged, but they expect that with these precautions, that won't happen for at least five more years, possibly much longer. "This far into the mission, the engineering team is being faced with a lot of challenges for which we just don't have a playbook," said Linda Spilker, project scientist for the mission as NASA's Jet Propulsion Laboratory in Southern California. "But they continue to come up with creative solutions." But that's not the only issue:The team is also uploading a software patch to prevent the recurrence of a glitch that arose on Voyager 1 last year. Engineers resolved the glitch, and the patch is intended to prevent the issue from occurring again in Voyager 1 or arising in its twin, Voyager 2... In 2022, the onboard computer that orients the Voyager 1 spacecraft with Earth began to send back garbled status reports, despite otherwise continuing to operate normally... The attitude articulation and control system (AACS) was misdirecting commands, writing them into the computer memory instead of carrying them out. One of those missed commands wound up garbling the AACS status report before it could reach engineers on the ground. The team determined the AACS had entered into an incorrect mode; however, they couldn't determine the cause and thus aren't sure if the issue could arise again. The software patch should prevent that. "This patch is like an insurance policy that will protect us in the future and help us keep these probes going as long as possible," said JPL's Suzanne Dodd, Voyager project manager. "These are the only spacecraft to ever operate in interstellar space, so the data they're sending back is uniquely valuable to our understanding of our local universe." Since their launch in 1977, NASA's two Voyager probes have travelled more than 12 billion miles (each!), and are still sending back data from beyond our solar system.Read more of this story at Slashdot.
"In 2023, the state of our digital privacy is: Very Creepy." That's the verdict from Mozilla's first-ever "Annual Consumer Creep-o-Meter," which attempts to set benchmarks for digital privacy and identify trends:Since 2017, Mozilla has published 15 editions of *Privacy Not Included, our consumer tech buyers guide. We've reviewed over 500 gadgets, apps, cars, and more, assessing their security features, what data they collect, and who they share that data with. In 2023, we compared our most recent findings with those of the past five years. It quickly became clear that products and companies are collecting more personal data than ever before - and then using that information in shady ways... Products are getting more secure, but also a lot less private. More companies are meeting Mozilla's Minimum Security Standards like using encryption and providing automatic software updates. That's good news. But at the same time, companies are collecting and sharing users' personal data like never before. And that's bad news. Many companies now view their hardware or software as a means to an end: collecting that coveted personal data for targeted advertising and training AI. For example: The mental health app BetterHelp shares your data with advertisers, social media platforms, and sister companies. The Japanese car manufacturer Nissan collects a wide range of information, including sexual activity, health diagnosis data, and genetic information - but doesn't specify how. An increasing number of products can't be used offline. In the past, the privacy conscious could always buy a connected device but turn off connectivity, making it "dumb." That's no longer an option in many cases. The number of connected devices that require apps and can't be used offline are increasing. This trend, coupled with the first, means it's harder and harder to keep your data private. Privacy policies also need improvement. "Legalese, ambiguity, and policies that sprawl across multiple documents and URLs are the status quo. And it's getting worse, not better. Companies use these policies as a shield, not an actual resource for consumers." They note that Toyota has more than 10 privacy policy documents, and that it would actually take five hours to read all the privacy documents the Meta Quest Pro VR headset. In the end they advise opting out of data collection when possible, enabling security features, and "If you're not comfortable with a product's privacy, don't buy it. And, speak up. Over the years, we've seen companies respond to consumer demand for privacy, like when Apple reformed app tracking and Zoom made end-to-end encryption a free feature." You can also take a quiz that calculates your own privacy footprint (based on whether you're using consumer tech products like the Apple Watch, Nintendo Switch, Nook, or Telegram). Mozilla's privacy advocates award the highest marks to privacy-protecting products like Signal, Sonos' SL Speakers, and the Pocketbook eReader (an alternative to Amazon's Kindle. (Although 100% of the cars reviewed by Mozilla "failed to meet our privacy and security standards.") The graphics on the site help make its point. As you move your mouse across the page, the cartoon eyes follow its movement...Read more of this story at Slashdot.
"Twenty phone companies may soon have all their voice calls blocked by US carriers," reports Ars Technica, "because they didn't submit real plans for preventing robocalls on their networks."The 20 carriers include a mix of US-based and foreign voice service providers that submitted required "robocall mitigation" plans to the Federal Communications Commission about two years ago. The problem is that some of the carriers' submissions were blank pages and others were bizarre images or documents that had no relation to robocalls. The strange submissions, according to FCC enforcement orders issued Monday, included "a .PNG file depicting an indiscernible object," a document titled "Windows Printer Test Page," an image "that depicted the filer's 'Taxpayer Profile' on a Pakistani government website," and "a letter that stated: 'Unfortunately, we do not have such a documents.'" Monday's FCC announcement said the agency's Enforcement Bureau issued orders demanding that "20 non-compliant companies show cause within 14 days as to why the FCC should not remove them from the database for deficient filings." The orders focus on the certification requirements and do not indicate whether these companies carry large amounts of robocall traffic. Each company will be given "an opportunity to cure any deficiencies in its robocall mitigation program description or explain why its certification is not deficient." After the October 30 deadline, the companies could be removed from the FCC's Robocall Mitigation Database. Removal from the database would oblige other phone companies to block all of their calls.Read more of this story at Slashdot.
"Mark Zuckerberg is making good on his promise to accelerate the use of Threads," reports Business Insider:The Meta CEO insisted in July that the app was not in its final form. "I'm highly confident that we're gonna be able to pour enough gasoline on this to help it grow," Zuckerberg said. Since then, Threads has rolled out a host of major new features, including a web version, keyword search, voice posts, and the ability to edit posts, even as it avoids promoting news. Smaller things, too, like being able to follow updates in individual threads at the tap of a bell icon, a way to mass follow people mentioned in a post, and even tag people's Instagram accounts, are now available... More Threads features are said to be on the way, like polls. But Insider also reports that "As the app has matured quickly in recent weeks, users have started to return and downloads have continued to rise."So far in October, Threads has hovered around 33 million daily active users and 120 million monthly active users, according to data from Apptopia, up from about 25 million daily users and 100 million monthly users in July... Since the app launched on July 6, it's been downloaded 260 million times, Apptopia data shows, with downloads in September almost double the downloads in August... Although the entire team working on Threads remains small by Meta standards, around 50 people, the company was surprised by the interest in the app and "really wants it to work," an employee said. To that end, Threads is now being integrated to an extent with Facebook and Instagram, two of the most popular apps in the world. There is a direct link to Threads on each user's Instagram page, a post on Threads can be sent in Instagram DMs, and as of this week, Threads is being promoted within the Instagram app feed via a small carousel of select posts under the header "Threads for you...." It's not just Instagram, according to BGR. "If you've been posting some especially strange messages Threads, thinking that only the few people who follow you will see them, I have some bad news for you..."As spotted by TechCrunch, users on Facebook have noticed something new on their News Feed: content from Threads. It appears that Meta is now showing Facebook users a new "For You from Threads" section on the News Feed that contains recommended content from the sibling social media platform.Read more of this story at Slashdot.
In August the Fukushima Daiichi nuclear power plant started releasing treated radioactive wastewater into the sea - a process they plan to continue for decades. Now the International Atomic Energy Agency has sent a team to sample the water near the plant. And the Associated Press reports that a team member "said Thursday he does not expect any rise in radiation levels in the fish caught in the regional seas."The IAEA team watched flounder and other popular kinds of fish being caught off the coast earlier Thursday and brought on boats to the Hisanohama port in southern Fukushima for an auction. "I can say that we don't expect to see any change starting in the fish," said Paul McGinnity, an IAEA marine radiology scientist. A small rise in the levels of tritium, which cannot be removed from the Fukushima Daiichi wastewater by the plant's treatment system called ALPS, is possible in locations close to the discharge points, but the levels of radioactivity are expected to be similar to those measured before the discharge last year, he said... The IAEA has reviewed the safety of the wastewater release and concluded in July that if carried out as planned, it would have a negligible impact on the environment, marine life and human health. During the Oct. 16-23 visit, the IAEA team also inspected the collection and processing of seawater and marine sediment near the plant... The sampling work will be followed by a separate IAEA task force that will review the safety of the treated radioactive water... Tokyo Electric Power Company and the government say discharging the water into the sea is unavoidable because the tanks will reach their 1.37 million-ton capacity next year and space at the plant will be needed for its decommissioning, which is expected to take decades, if it is achievable at all. They say the water is treated to reduce radioactive materials to safe levels, and then is diluted with seawater by hundreds of times to make it much safer than international standards. Some experts say such long-term release of low-dose radioactivity is unprecedented and requires close monitoring.Read more of this story at Slashdot.
The BBC writes: Super Mario Bros: Wonder is a psychedelic take on the traditional 2D platformer that jazzes up Mario's usual Bowser-thwarting adventure with Wonder Effects that, as Polygon's Chris Plante put it, sees "the levels themselves collapse and contort, disobeying the laws established by decades of Mario games". It's as if developers unearthed the "stuffed notebook of chaos" of every wacky idea ever rejected from the series and turned it into a single game, Plante said... [T]he game offers "so many different looks and wild hooks that the typically forgettable story simply didn't matter," said IGN's Ryan McCaffrey, who enthused: "Every frame oozes joy...." The Guardian's Keza McDonald says the game carries the sort of fun expected by Mario fans, "but with enough novelty and unexpected twists to prevent it from feeling over-familiar", and at the same time for newcomers "is a wonderful introduction to the fizzy creativity and attention to detail that has made Mario a family staple". This is the first time the Mario developers have delved into online multiplayer in the traditional 2D space, where previously co-op play required players to share a console in person. "It feels more like you're working together," McDonald said. "Characters can revive one another if someone falls foul of a Bullet Bill or flaming pit, making the game much easier to get through as a team." GamesRadar's Sam Loveridge added "There's also an attention to detail here that just heightens that magic playfulness. There's so much to spot, whether it's the snot bubble on a sleeping Goomba or the fact each character's face changes when they start dashing." Although Kotaku has a suggestion. "Before you get too ahead of yourself turning Mario and company into giant elephants and whatnot, you should mess around with some gameplay settings first - especially the one that controls the Talking Flowers."Earlier this week, in another edition of Nintendo's ongoing web series, Ask the Developer, we learned that Wonder was originally going to have a live commentary feature like what you'd find in a sports game. It was scrapped, but found new life through the game's Talking Flowers characters who shout at Mario and crew whenever they walk by. Although the Talking Flowers are a cute addition to the game and make solo playthroughs a little less lonely, your mileage with them may vary. Some people think the Talking Flowers, who talk all the time, are pretty annoying, if you can believe that.Read more of this story at Slashdot.
Last week the Linux Foundation announced its Civil Infrastructure Platform project "has expanded its super-long-term stable kernel program with a 6.1-based series. "Just like for the previously started kernel series (4.4-cip, 4.19-cip and 5.10-cip), the project is committed to maintaining the 6.1-cip kernel for a minimum of 10 years after its initial release."The Civil Infrastructure Platform project is establishing an open source base layer of industrial grade Linux to enable the use and implementation of software building blocks for civil infrastructure. The project's kernels are maintained like regular long-term-stable kernels, and developers of the CIP kernel are also involved in long-term-stable kernel review and testing. While regular long-term-stable kernels are moving back to 2 years maintenance, CIP kernels are set up for 10 years. In order to enable this extended lifetime, CIP kernels are scoped-down in actively supported kernel features and target architecture. At the same time, CIP kernels accept non-invasive backports from newer mainline kernels that enable new hardware... "The CIP kernels are developed and reviewed with the same meticulous attention as regular Long-Term-Stable kernels," said Yoshi Kobayashi, Technical Steering Committee Chair at the CIP project. "Our developers actively participate in reviewing and testing long-term-stable kernels, contributing to the overall quality and security of the platform. A key highlight is our work on the IEC 62443 security standard, aimed at fortifying the resilience of critical infrastructure systems." "As 2023 comes to a close, the CIP project has stood as a beacon of stability and innovation, with a commitment to driving collaboration to strengthen this essential initiative," said Urs Gleim, Governing Board Chair at the CIP project... The Civil Infrastructure Platform is driving open source collaboration and innovation around industrial grade software for products used in industrial automation and for civil infrastructure, such as trains and power grids. To learn more about the CIP project, including how to get involved and contribute, please visit our booth at the Linux Foundation Open Source Summit Japan, December 5 - 6, or visit our website.Read more of this story at Slashdot.
An anonymous reader shared this report from the Washington Post: For years, tech companies like OpenAI have freely used news stories to build data sets that teach their machines how to recognize and respond fluently to human queries about the world. But as the quest to develop cutting-edge AI models has grown increasingly frenzied, newspaper publishers and other data owners are demanding a share of the potentially massive market for generative AI, which is projected to reach $1.3 trillion by 2032, according to Bloomberg Intelligence. Since August, at least 535 news organizations - including the New York Times, Reuters and The Washington Post - have installed a blocker that prevents their content from being collected and used to train ChatGPT. Now, discussions are focused on paying publishers so the chatbot can surface links to individual news stories in its responses, a development that would benefit the newspapers in two ways: by providing direct payment and by potentially increasing traffic to their websites. In July, OpenAI cut a deal to license content from the Associated Press as training data for its AI models. The current talks also have addressed that idea, according to two people familiar with the talks who spoke on the condition of anonymity to discuss sensitive matters, but have concentrated more on showing stories in ChatGPT responses. Other sources of useful data are also looking for leverage. Reddit, the popular social message board, has met with top generative AI companies about being paid for its data, according to a person familiar with the matter, speaking on the condition of anonymity to discuss private negotiations. If a deal can't be reached, Reddit is considering blocking search crawlers from Google and Bing, which would prevent the forum from being discovered in searches and reduce the number of visitors to the site. But the company believes the trade-off would be worth it, the person said, adding: "Reddit can survive without search." "The moves mark a growing sense of urgency and uncertainty about who profits from online information," the article argues. "With generative AI poised to transform how users interact with the internet, many publishers and other companies see fair payment for their data as an existential issue." They also cite James Grimmelmann, a professor of digital and information law at Cornell University, who suggests OpenAI's decision to negotiate "may reflect a desire to strike deals before courts have a chance to weigh in on whether tech companies have a clear legal obligation to license - and pay for - content." Read more of this story at Slashdot.
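The article doesn't name the blocking mechanism, but the approach publishers have described is a robots.txt rule aimed at OpenAI's GPTBot crawler. A minimal sketch using Python's standard-library robots.txt parser shows what such a rule looks like and how a compliant crawler would read it (example.com is a placeholder, not a real publisher's site):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt of the kind many news sites added in 2023: it singles out
# OpenAI's GPTBot crawler while leaving other user agents alone.
robots_txt = """\
User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant GPTBot is told to stay out entirely; other crawlers are unaffected
# because no default ("*") rule is declared.
print(parser.can_fetch("GPTBot", "https://example.com/news/story"))        # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/news/story"))  # True
```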
"Planetary scientists want to search for biosignatures in what they believe was once a Martian mud lake," reports Space.com:After scientists carefully studied what they believe are desiccated remnants of an equatorial mud lake on Mars, their study of Hydraotes Chaos suggests a buried trove of water surged onto the surface. If researchers are right, then this flat could become prime ground for future missions seeking traces of life on Mars...More generally, scientists suggest surface water on Mars froze over about 3.7 billion years ago as the atmosphere thinned and the surface cooled. But underground, groundwater might still have remained liquid in vast chambers. Moreover, life forms might have abided in those catacombs - leaving behind traces of their existence. Only around 3.4 billion years ago did that system of aquifers break down in Hydraotes Chaos, triggering floods of epic proportions that dumped mountains' worth of sediment onto the surface, the study suggests. Future close-up missions could someday examine that sediment for biosignatures... Alexis Rodriguez, a senior scientist at the Planetary Science Institute in Arizona, and his colleagues pored over images of Hydraotes Chaos taken by NASA's Mars Reconnaissance Orbiter in search of more clues. In the midst of the chaos terrain's maelstrom lies a calm circle of relatively flat ground. This plain is pockmarked with cones and domes, with hints of mud bubbling from below - suggesting that sediment did not arrive via a rushing flash flood, but instead rose from underneath. Based on simulations, the authors suggest Hydraotes Chaos overlaid a reservoir of buried biosignature-rich water - potentially in the form of thick ice sheets. Ultimately - potentially from the Red Planet's internal heat melting the ice - that water bubbled up to the surface and created a muddy lake. As the water dissipated, it would have left behind all those tantalizing biosignatures. Curiously, that water might have remained underground even after those megafloods. In fact, the authors' results suggest the sediment on the surface of this mud lake dates from only around 1.1 billion years ago: long after most of Mars's groundwater ought to have flooded out, and certainly long after Mars was habitable. With that timeline in mind, Rodriguez and colleagues plan to analyze what lies under the surface of the lake. That, Rodriguez tells Space.com, would allow scientists to establish when in Martian history the planet might have hosted life. Rodriguez tells Space.com that this region is now "under consideration" for testing with an under-development NASA instrument called Extractor for Chemical Analysis of Lipid Biomarkers in Regolith (EXCALIBR) - that could test extraterrestrial rocks for biomarkers like lipids.Read more of this story at Slashdot.
"No roof, no solar power. That has been the dispiriting equation shutting out roughly half of all Americans from plugging into the sun," writes the Washington Post's "Climate Coach" column. "But signing up for solar soon might be as easy as subscribing to Netflix."Scores of new small solar farms that sell clean, local electricity directly to customers are popping up. The setup, dubbed "community solar," is designed to bring solar power to people who don't own their own homes or can't install panels - often at prices below retail electricity rates... At least 22 states have passed legislation encouraging independent community solar projects, but developers are just beginning to expand. Most existing projects are booked. At the moment, community solar projects in the United States generate enough electricity to power about 918,000 homes - less than 1 percent of total households, according to the Solar Energy Industries Association, a nonprofit trade group. But as more states join, and the Environmental Protection Agency's "Solar for All" program pours billions into federal solar power grants, more Americans will get the chance... While projects exist in most states, they are highly concentrated: More than half are in Massachusetts, Minnesota and New York. These might be on a condo roof, or on open land like the 10-MW Fresno community solar farm, on a city-owned plot surrounded by agricultural land. Most are small: 2 megawatts of capacity on average, about enough to power 200 to 400 homes... The renewable energy marketplace EnergySage and the nonprofit Solar United Neighbors connect customers to community solar projects in their region. People generally receive monthly credits for electricity produced by their share of solar panels. These are subtracted from their total electricity bill or credited on future bills... Subscribers on average save about 10 percent on their utility bill (the range is 5 percent to 15 percent). These economics are propelling the industry to record heights. Between 2016 and 2019, community solar capacity more than quadrupled to 1.4 gigawatts. By the end of this year, energy research firm Wood Mackenzie estimates, there will be 6 GW of community solar. And the Energy Department wants to see community solar reach 5 million households by 2025. "The economics are strongly on the side of doing this," says Dan Kammen, an energy professor at the University of California at Berkeley. "It's now cheaper to build new solar than to operate old fossil [fuel plants]. ... We're at the takeoff point." The article notes "solar for renters" saves about $100 per year for the average ratepayer (while rooftop solar arrays may save homeowners over $1,000 annually). But according to the article, the arrangement still "reflects a new reality... "Solar energy prices are falling as private and public money, and new laws, are fueling a massive expansion of small-scale community solar projects."Read more of this story at Slashdot.
"Dropbox said Friday that it's agreed to return over one quarter of its San Francisco headquarters to the landlord," reports CNBC, "as the commercial real estate market continues to soften following the Covid pandemic." The article notes that last year Dropbox's accountants declared a $175.2 million "impairment" on the office - a permanent reduction in its value - calling it "a result of adverse changes" in the market. And the year before they announced another $400 million charge "related to real estate assets." Friday CNBC reported: In a filing, Dropbox said it agreed to surrender to its landlord 165,244 square feet of space and pay $79 million in termination fees. Under the amendment to its lease agreement, Dropbox will offload the space over time through the first quarter of 2025. Since going remote during the pandemic three years ago, Dropbox has been trying to figure out what to do with much of the 736,000 square feet of space in Mission Bay it leased in 2017, in what was the largest office lease in the city's history. The company subleased closed to 134,000 square feet of space last year to Vir Biotechnology, leaving it with just over 604,000 square feet... "As we've noted in the past, we've taken steps to de-cost our real estate portfolio as a result of our transition to Virtual First, our operating model in which remote work is the primary experience for our employees, but where we still come together for planned in-person gatherings," a company spokesperson told CNBC in an emailed statement... Dropbox's 2017 lease for the brand new headquarters was for 15 years... "As a result of the amendment the company will avoid future cash payments related to rent and common area maintenance fees of $137 million and approximately $90 million, respectively, over the remaining 10 year lease term," Dropbox said in Friday's filing. A short walk away from Dropbox, Uber has been trying to sublease part of its headquarters. The article also notes that San Francisco's office vacancy rate "stood at 30% in the third quarter, the highest level since at least 2007, according to city data."Read more of this story at Slashdot.
The Washington Post tells the story of a veteran political operative and a former army intelligence officer hired to help keep in power the president of the west African nation Burkina Faso:Their company, Percepto International, was a pioneer in what's known as the disinformation-for-hire business. They were skilled in deceptive tricks of social media, reeling people into an online world comprised of fake journalists, news outlets and everyday citizens whose posts were intended to bolster support for [president Roch Marc] Kabore's government and undercut its critics. But as Percepto began to survey the online landscape across Burkina Faso and the surrounding French-speaking Sahel region of Africa in 2021, they quickly saw that the local political adversaries and Islamic extremists they had been hired to combat were not Kabore's biggest adversary. The real threat, they concluded, came from Russia, which was running what appeared to be a wide-ranging disinformation campaign aimed at destabilizing Burkina Faso and other democratically-elected governments on its borders. Pro-Russian fake news sites populated YouTube and pro-Russian groups abounded on Facebook. Local influencers used WhatsApp and Telegram groups to organize pro-Russian demonstrations and praise Russian President Vladimir Putin. Facebook fan pages even hailed the Wagner Group, the Russian paramilitary network run by Yevgeniy Prigozhin, the late one-time Putin ally whose Internet Research Agency launched a disinformation campaign in the United States to influence the 2016 presidential election... Percepto didn't know the full scope of the operation it had uncovered but it warned Kabore's government that it needed to move fast: Launch a counteroffensive online - or risk getting pushed out in a coup. Three years later, the governments of five former French colonies, including Burkina Faso, have been toppled. The new leaders of two of those countries, Mali and Burkina Faso, are overtly pro-Russian; in a third, Niger, the prime minister installed after a July coup has met recently with the Russian ambassador. In Mali and the Central African Republic, French troops have been replaced with Wagner mercenaries... Percepto's experience in French-speaking Africa offers a rare window into the round-the-clock information warfare that is shaping international politics - and the booming business of disinformation-for-hire. Meta, the social media company that operates Facebook, Instagram and WhatsApp, says that since 2017 it has detected more than 200 clandestine influence operations, many of them mercenary campaigns, in 68 countries. The article also makes an interesting point. "The burden of battling disinformation has fallen entirely on Silicon Valley companies."Read more of this story at Slashdot.
In 1999 cybersecurity pundit Bruce Schneier answered questions from Slashdot's readers. 24 years later, on his personal blog, Schneier is still offering his insights. Last month Schneier said that warnings about millions of vacant cybersecurity positions around the world "never made sense to me" - and then shared this alternate theory. From the blog of cybersecurity professional Ben Rothke: [T]here is not a shortage of security generalists, middle managers, and people who claim to be competent CISOs. Nor is there a shortage of thought leaders, advisors, or self-proclaimed cyber subject matter experts. What there is a shortage of are computer scientists, developers, engineers, and information security professionals who can code, understand technical security architecture, product security and application security specialists, analysts with threat hunting and incident response skills. And this is nothing that can be fixed by a newbie taking a six-month information security boot camp.... In fact, security roles are often not considered entry-level at all. Hiring managers assume you have some other background, usually technical, before you are ready for an entry-level security job. Without those specific skills, it is difficult for a candidate to break into the profession. Job seekers learn that entry-level often means at least two to three years of work experience in a related field. Rothke's post offers two conclusions: "Human resources needs to understand how to effectively hire information security professionals. Expecting an HR generalist to find information security specialists is a fruitless endeavor at best." And: "So is there really an information security jobs crisis? Yes, but not in the way most people portray it to be." Read more of this story at Slashdot.
"A company backed by BlackRock has abandoned plans to build a 1,300-mile pipeline across the US Midwest to collect and store carbon emissions from the corn ethanol industry," reports Ars Technica. The move comes "following opposition from landowners and some environmental campaigners."Navigator CO2 on Friday said developing its carbon capture and storage (CCS) project called Heartland Greenway had been "challenging" because of the unpredictable nature of regulatory and government processes in South Dakota and Iowa. Navigator's decision to scrap its flagship $3.1 billion project - one of the biggest of its kind in the US - is a blow for a fledgling industry... It also represents a setback for the carbon-intensive corn ethanol refining industry, a pillar of the rural Midwestern economy which is targeting industry-scale CCS as a way to reduce emissions... The project faced opposition from local landowners, who expressed concerns about safety and property seizures, and some environmentalists who describe CO2 pipelines as dangerous and a way to prop up the fossil fuels industry, which already has a network of such infrastructure. Addressing the decision by Navigator, the Coalition To Stop CO2 Pipelines said it "celebrates this victory," but added: "we also know that the tax incentives made available by the federal government for carbon capture, transport and storage likely mean another entity will pick up Navigator's project, or find a different route through Illinois." The article cites one analyst at energy research firm Wood Mackenzie who believes this cancellation could benefit rival carbon-capture companies like Summit Carbon Solutions, which is planning an even larger network of CO2 pipelines throughout the Midwest, and could try to sign deals with Navigator's former customers.Read more of this story at Slashdot.
404 Media (working with Court Watch) reports on a $30 million cash-for-Bitcoin laundering ring operating in the heart of New York: For years, a gang operating in New York allegedly offered a cash-for-Bitcoin service that generated at least $30 million, with men standing on street corners with plastic shopping bags full of money, drive-by pickups, and hundreds of thousands of dollars laid out on tables, according to court records. The records provide rare insight into an often unseen part of the criminal underworld: how hackers and drug traffickers convert their Bitcoin into cash outside of the online Bitcoin exchanges that ordinary people use. Rather than turning to sites like Coinbase, which often collaborate with and provide records to law enforcement if required, some criminals use underground, in-real-life Bitcoin exchanges like this gang, which are allegedly criminal entities in their own right. In a long-running investigation by the FBI involving a confidential source and undercover agents, one member of the crew said "that at least some of his clients made money by selling drugs, that his wealthiest clients were hackers, and that he had made approximately $30 million over the prior three years through the exchange of cash for virtual currency," the court records read. Thanks to user Slash_Account_Dot for sharing the news. Read more of this story at Slashdot.
"The Worker as Futurist project assists rank-and-file Amazon workers to write short speculative fiction," explains its web site. "In a world where massive corporations not only exploit people but monopolize the power of future-making, how can workers and other people fight and write back?" I couldn't find any short stories displayed on their site, but there are plans to publish a book next year collecting the workers' writing about "the world after Amazon" in print, online and in audiobook format. And there's also a podcast about "the world Amazon is building and the workers and writers struggling for different futures." From their web site:A 2022 pilot project saw over 25 workers gather online to discuss how SF shed light on their working conditions and futures. In 2023, 13 workers started to meet regularly to build their writing skills and learn about the future Amazon is compelling its workers to create... The Worker as Futurist project aims, in a small way, to place the power of the imagination back in the hands of workers. This effort is in solidarity with trade union mobilizations and workers self-organization at Amazon. It is also in solidarity with efforts by civil society to reign in Amazon's power. Four people involved with the project shared more details in the socialist magazine Jacobin:At stake is a kind of corporate storytelling, which goes beyond crass propaganda but works to harness the imagination. Like so many corporations, Amazon presents itself as surfing the wave of the future, responding to the relentless and positive force of the capitalist market with innovation and optimism. Such stories neatly exonerate the company and its beneficiaries from the consequences of their choices for workers and their world... WWS doesn't focus on science fiction. But it does show the radical power of the imagination that comes when workers don't just read inspiring words, but come together to write and thereby take the power of world-building and future-making back into their hands. This isn't finding individual commercial or literary success, but dignity, imagination, and common struggle... Our "Worker as Futurist" project returns the power of the speculative to workers, in the name of discovering something new about capitalism and the struggle for something different. We have tasked these workers with writing their own futures, in the face of imaginaries cultivated by Amazon that see the techno-overlords bestride the world and the stars. Thanks to funding from Canada's arms-length, government-funded Social Sciences and Humanities Research Council, our team of scholars, teachers, writers, and activists has been able to pay Amazon workers (warehouse workers, drivers, copy editors, MTurk workers, and more) to participate in a series of skill-building writing workshops and information sessions. In each of these online forums, we were joined by experts on speculative fiction, on Amazon, and on workers' struggles. At the end of this series of sessions, the participants were supported to draft the stories they wanted to tell about "The World After Amazon...." We must envision the futures we want in order to mobilize and fight for them together, rather than cede that future to those who would turn the stars into their own private sandbox. It is in the process of writing and sharing writing we can come to an awareness of something our working bodies know but that we cannot otherwise articulate or express. 
The rank-and-file worker - the target of daily exploitation, forced to build their boss's utopia - may have encrypted within them the key to destroying his world and building a new one.Read more of this story at Slashdot.
Linus Torvalds has said he bought a Dell XPS-13 with Ubuntu Linux for his daughter. Now ZDNet shares some trivia from the history of "the most well-known Linux laptop," citing a presentation by Barton George, Dell Technologies' Developer Community manager, at the Linux/open-source conference All Things Open:First, however, you should know that Dell has supported Linux desktops and laptops since the middle 2000s. In 2006, Michael Dell told me that Dell would be the first major PC vendor to release and support desktop Linux - and this proved to be a success. Barton George explained that Dell had always done great volume with these computers. Not volume, like the Windows machines, of course, but enough that Dell has always offered Linux-based - primarily Red Hat Enterprise Linux (RHEL) powered - workstations. Still, none of these machines really appealed to developers... George announced on his personal blog what Dell was planning, and his traffic went from 60 views a day to 15,000. Then, as now, there's a lot of interest in laptops that come with Linux ready to go... Dell got together with Canonical, Ubuntu Linux's parent company, to make sure all the drivers were in place for a top-notch Ubuntu Linux developer desktop experience. Indeed, the name 'Project Sputnik' is a nod to Mark Shuttleworth, Ubuntu founder and Canonical CEO. A decade before the project itself, Shuttleworth had spent eight days orbiting the Earth in a Soviet Soyuz spacecraft. George and the crew decided "Soyuz" didn't have an inspiring ring to it, so the company went with "Sputnik" instead. George continued: "We announced a beta program for the machine with a 10% off offer. We thought, well, we'll probably get 300 people. Instead, we got 6,000. This is where senior management said OK, you've got something real."Read more of this story at Slashdot.
Slashdot reader Striek remembers Silicon Valley's long history of open source development - and how HashiCorp "made the controversial decision to change licenses from the Mozilla Public License to MariaDB's Business Source License." The key difference between these two licenses is that the BSL limits its grant to "non-production use". HashiCorp's CEO is now predicting there would be "no more open source companies in Silicon Valley" unless the community rethinks how it protects innovation, reports The Stack: While open source advocates had slammed [HashiCorp's] license switch, CEO Dave McJannet described the reaction from its largest customers as "Great. Because you're a critical partner to us and we need you to be a big, big company." Indeed, he claimed that "A lot of the feedback was, 'we wished you had done that sooner'" - adding that the move had been discussed with the major cloud vendors ahead of the announcement. "Every vendor over the last three or four years that has reached any modicum of scale has come to the same conclusion," said McJannet. "It's just the realisation that the open source model has to evolve, given the incentives that are now in the market." He claimed the historic model of foundations was broken, as they were dominated by legacy vendors. Citing the case of Hadoop, he said: "They're a way for big companies to protect themselves from innovation, by making sure that if Hadoop becomes popular, IBM can take it and sell it for less because they are part of that foundation." The evolution to putting open source products on GitHub had worked "really, really well" but once a project became popular, there was an incentive for "clone vendors to start taking that stuff." He claimed that "My phone started ringing materially after we made our announcement from every open source startup in Silicon Valley going 'I think this is the right model'." He said the Linux Foundation's adoption of OpenTofu raised serious questions. "What does it say for the future of open source, if foundations will just take it and give it a home. That is tragic for open source innovation. I will tell you, if that were to happen, there'll be no more open source companies in Silicon Valley." HashiCorp also announced a beta using generative AI to produce new module tests, and HCP Vault Radar, which scans code for secrets, personally identifiable information, dependency vulnerabilities, and non-inclusive language.Read more of this story at Slashdot.
An anonymous reader shared this report from SciTechDaily:Since the 1980s, researchers have observed significant periods of unrest in a region of California's Eastern Sierra Nevada mountains characterized by swarms of earthquakes as well as the ground inflating and rising by almost half an inch per year during these periods. The activity is concerning because the area, called the Long Valley Caldera, sits atop a massive dormant supervolcano... What is behind the increased activity in the last few decades? Could it be that the area is preparing to erupt again? Or could the uptick in activity actually be a sign that the risk of a massive eruption is decreasing? To answer these questions, Caltech researchers have created the most detailed underground images to date of the Long Valley Caldera, reaching depths up to 10 kilometers within the Earth's crust. These high-resolution images reveal the structure of the earth beneath the caldera and show that the recent seismic activity is a result of fluids and gases being released as the area cools off and settles down. The work was conducted in the laboratory of Zhongwen Zhan (PhD '14), professor of geophysics. A paper describing the research was published on October 18 in the journal Science Advances. "We don't think the region is gearing up for another supervolcanic eruption, but the cooling process may release enough gas and liquid to cause earthquakes and small eruptions," says Zhan. "For example, in May 1980, there were four magnitude 6 earthquakes in the region alone."Read more of this story at Slashdot.
Breached web sites distribute malware to visitors by claiming they need to update their browser. But one group of attackers "have developed an ingenious way of keeping their malware from being taken down by security experts or law enforcement," reports security researcher Brian Krebs. "By hosting the malicious files on a decentralized, anonymous cryptocurrency blockchain." [W]hen Cloudflare blocked those accounts the attackers began storing their malicious files as cryptocurrency transactions in the Binance Smart Chain (BSC), a technology designed to run decentralized apps and "smart contracts," or coded agreements that execute actions automatically when certain conditions are met. Nati Tal, head of security at Guardio Labs, the research unit at Tel Aviv-based security firm Guardio, said the malicious scripts stitched into hacked WordPress sites will create a new smart contract on the BSC Blockchain, starting with a unique, attacker-controlled blockchain address and a set of instructions that defines the contract's functions and structure. When that contract is queried by a compromised website, it will return an obfuscated and malicious payload. "These contracts offer innovative ways to build applications and processes," Tal wrote along with his Guardio colleague Oleg Zaytsev. "Due to the publicly accessible and unchangeable nature of the blockchain, code can be hosted 'on-chain' without the ability for a takedown." Tal said hosting malicious files on the Binance Smart Chain is ideal for attackers because retrieving the malicious contract is a cost-free operation that was originally designed for the purpose of debugging contract execution issues without any real-world impact. "So you get a free, untracked, and robust way to get your data (the malicious payload) without leaving traces," Tal said. In response to questions from KrebsOnSecurity, the BNB Smart Chain (BSC) said its team is aware of the malware abusing its blockchain, and is actively addressing the issue. The company said all addresses associated with the spread of the malware have been blacklisted, and that its technicians had developed a model to detect future smart contracts that use similar methods to host malicious scripts. "This model is designed to proactively identify and mitigate potential threats before they can cause harm," BNB Smart Chain wrote. "The team is committed to ongoing monitoring of addresses that are involved in spreading malware scripts on the BSC. To enhance their efforts, the tech team is working on linking identified addresses that spread malicious scripts to centralized KYC [Know Your Customer] information, when possible."Read more of this story at Slashdot.
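To make the takedown problem concrete, here is a minimal, hypothetical Python sketch (not from Guardio's report): any client can fetch a contract's deployed bytecode from a public BSC JSON-RPC endpoint with a free, read-only eth_getCode call, which is why data parked on-chain cannot simply be pulled down the way a hosted file can. The endpoint URL and the zeroed-out contract address are placeholders, and the snippet is framed from a defender's perspective, retrieving bytecode for inspection.

```python
# Minimal sketch (not from the article): shows why on-chain payloads are hard
# to take down -- anyone can read contract code from a public BSC JSON-RPC
# endpoint for free, with no account or gas required.
# The endpoint URL and contract address below are placeholders/assumptions.
import requests

RPC_URL = "https://bsc-dataseed.binance.org/"   # public BSC endpoint (assumed)
CONTRACT = "0x0000000000000000000000000000000000000000"  # placeholder address

def get_contract_code(address: str) -> str:
    """Fetch the deployed bytecode stored at `address` via eth_getCode."""
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_getCode",
        "params": [address, "latest"],
    }
    resp = requests.post(RPC_URL, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json().get("result", "0x")

if __name__ == "__main__":
    code = get_contract_code(CONTRACT)
    # A defender could hash or scan this bytecode against known-bad signatures.
    print(f"{len(code) // 2 - 1} bytes of bytecode at {CONTRACT}")
```

Because the read is served by any public node and the contract itself is immutable, blocking distribution means blacklisting addresses or filtering at the RPC layer, which matches the mitigation BNB Smart Chain describes above.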
"The major online platforms are breaking up with news," reports the New York Times:Campbell Brown, Facebook's top news executive, said this month that she was leaving the company. Twitter, now known as X, removed headlines from the platform days later. The head of Instagram's Threads app, an X competitor, reiterated that his social network would not amplify news. Even Google - the strongest partner to news organizations over the past 10 years - has become less dependable, making publishers more wary of their reliance on the search giant. The company has laid off news employees in two recent team reorganizations, and some publishers say traffic from Google has tapered off... Some executives of the largest tech companies, like Adam Mosseri at Instagram, have said in no uncertain terms that hosting news on their sites can often be more trouble than it is worth because it generates polarized debates... Publishers seem resigned to the idea that traffic from the big tech companies will not return to what it once was. Even in the long-fractious relationship between publishers and tech platforms, the latest rift stands out - and the consequences for the news industry are stark. Many news companies have struggled to survive after the tech companies threw the industry's business model into upheaval more than a decade ago. One lifeline was the traffic - and, by extension, advertising - that came from sites like Facebook and Twitter. Now that traffic is disappearing. Top news sites got about 11.5% of their web traffic in the United States from social networks in September 2020, according to Similarweb, a data and analytics company. By September this year, it was down to 6.5%... The sharp decline in referral traffic from social media platforms over the past two years has hit all news publishers, including The New York Times. The Wall Street Journal noticed a decline starting about 18 months ago, according to a recording of a September staff meeting obtained by the Times. "We are at the mercy of social algorithms and tech giants for much of our distribution," Emma Tucker, the Journal's editor-in-chief, told the newsroom in the meeting... Google cut some members of its news partnership team in September, and this week it laid off as many as 45 workers from its Google News team, the Alphabet Workers Union said. (The Information, a tech news website, reported the Google News layoffs earlier.) "We've made some internal changes to streamline our organization," Jenn Crider, a Google spokesperson, said in a statement... Jaffer Zaidi [Google's vice president of global news partnerships], wrote in an internal memo reviewed by the Times that the team would be adopting more artificial intelligence. "We had to make some difficult decisions to better position our team for what lies ahead," he wrote... Privately, a number of publishers have discussed what a post-Google traffic future may look like and how to better prepare if Google's AI products become more popular and further bury links to news publications.Read more of this story at Slashdot.
Images from the James Webb Space Telescope "don't match scientists' models of how the universe formed," reports the Washington Post. "But it might not be time to dump the standard model of cosmology yet." A recent analysis in the Astrophysical Journal Letters suggests an explanation for the surprisingly massive-seeming galaxies: brilliant, extremely bright bursts of newborn stars. The galaxies photographed by the telescope looked far too mature and large to have formed so fully so soon after the universe began, raising questions about scientists' assumptions about galaxy formation. But when researchers ran a variety of computer simulations of the universe's earliest days, they discovered that the galaxies probably are not as large as they seem. Instead, they attribute their brightness to a phenomenon called "bursty star formation." As clouds of dust and debris collapse, they form dense, high-temperature cores and become stars. Bursty galaxies spit out new stars in intermittent, bright bursts instead of creating stars more consistently. Usually, these galaxies are low in mass and take long breaks between starbursts. Because the galaxies in question look so bright in photos produced by the Webb telescope, scientists at first thought they were older and more massive. But bursty systems with the ability to produce extremely bright, abundant light may appear more massive than they really are. "Not only does this finding explain why young galaxies appear deceptively massive, it also fits within the standard model of cosmology," explains the announcement: In the new study, Guochao Sun, who led the research, Northwestern's Claude-Andre Faucher-Giguere, the study's senior author, and their team used advanced computer simulations to model how galaxies formed right after the Big Bang. The simulations produced cosmic dawn galaxies that were just as bright as those observed by the JWST... Although other astrophysicists have hypothesized that bursty star formation could be responsible for the unusual brightness of galaxies at cosmic dawn, the Northwestern researchers are the first to use detailed computer simulations to prove it is possible. And they were able to do so without adding new factors that are unaligned with our standard model of the universe.Read more of this story at Slashdot.
Nearly two dozen species are being taken off America's endangered species list, reports CBS News, "because they are extinct, the U.S. Fish and Wildlife Service said Monday."Most of the species were listed under the Endangered Species Act in the 1970s or 1980s and were very low in numbers or likely already extinct at the time of listing. In the years since, "rigorous reviews of the best available science" have been conducted to determine whether the animals are extinct. "Federal protection came too late to reverse these species' decline, and it's a wake-up call on the importance of conserving imperiled species before it's too late," Service Director Martha Williams said. Scientists in 2019 warned that worldwide, 1 million species of plants and animals were at risk of extinction. There are more than 1,300 species listed as either endangered or threatened in the United States under the Endangered Species Act. The 21 species being removed include one mammal, 10 types of birds, two species of fish and eight types of mussels. Eight of the 21 species were found in Hawaii. From the agency's announcement:The 21 species extinctions highlight the importance of the Endangered Species Act and efforts to conserve species before declines become irreversible. The circumstances of each also underscore how human activity can drive species decline and extinction by contributing to habitat loss, overuse, and the introduction of invasive species and diseases... The Endangered Species Act has been highly effective and credited with saving 99% of listed species from extinction. Thus far, more than 100 species of plants and animals have been delisted based on recovery or reclassified from endangered to threatened based on improved conservation status, and hundreds more species are stable or improving thanks to the collaborative actions of Tribes, federal agencies, state and local governments, conservation organizations and private citizens. An official from the agency said in the announcement "The ultimate goal is to recover these species, so they no longer need the Act's protection."Read more of this story at Slashdot.
An anonymous reader quotes a report from CNN: China has unveiled plans to restrict exports of graphite -- a mineral crucial to the manufacture of batteries for electric vehicles (EVs) -- on national security grounds, the Ministry of Commerce and the General Administration of Customs said Friday. The announcement comes just days after the United States imposed additional limits on the kinds of semiconductors that American companies can sell to Chinese firms. China, which dominates the world's production and processing of graphite, says export permits will be needed, starting in December, for synthetic graphite material -- including high-purity, high-strength and high-density versions -- as well as for natural flake graphite. [...] According to the US Geological Survey (PDF), the market for graphite used in batteries has grown 250% globally since 2018. China was the world's leading graphite producer last year, accounting for an estimated 65% of global production, it said. Besides EVs, graphite is commonly used in the semiconductor, aerospace, chemical and steel industries. The export curbs were announced as China faces pressure from multiple governments over its commercial and trade practices. For more than a year, it has been embroiled in a tech war with the United States and its allies in Europe and Asia over access to advanced chips and chipmaking equipment. "At the moment both China and Western countries are engaged in a tit for tat, highlighting how protectionist measures often spread. Newton's third law that every action causes a reaction applies here, too," said Stefan Legge, head of tax and trade policy research at the University of St Gallen in Switzerland. "At the same time, both sides of the dispute also realize how costly it is if geopolitics trumps economics," he added.Read more of this story at Slashdot.
Stephen Clark reports via Ars Technica: Earlier this week, SpaceX launched for the 75th time this year, continuing a flight cadence that should see the company come close to 100 missions by the end of December. SpaceX plans to kick its launch rate into a higher gear in 2024. This will be largely driven by launches of upgraded Starlink satellites with the ability to connect directly with consumer cell phones, a service SpaceX calls "Starlink Direct to Cell," a company official told Ars this week. The goal next year is 12 launches per month, for a total of 144 Falcon rocket flights. Like this year, most of those missions will be primarily devoted to launching Starlink broadband satellites. So far in 2023, more than 60 percent of SpaceX's launches have delivered the company's own Starlink satellites into orbit. Here are some numbers. Last year, SpaceX launched 61 missions. In 2021, the number was 31. In the last 12 months, SpaceX has launched 88 Falcon rockets, plus one test flight of the company's much larger Starship rocket. SpaceX's success in recovering and reusing Falcon 9 boosters and payload fairings has been vital to making this possible. SpaceX has gone past the original goal of launching each Falcon 9 booster 10 times before a major overhaul, first to 15 flights, and then recently certifying boosters for up to 20 missions. Technicians can swap out parts like engines, fins, landing legs, and valves that malfunction in flight or show signs of wear. With so many launches planned next year, 20 flights is probably not a stopping point. "We might go a little higher," the SpaceX official said. SpaceX may also see an uptick in missions for external customers, like NASA, the U.S. Space Force, and commercial companies. "External demand for Falcon 9 and Falcon Heavy launches is 'steady,' the official said, but some customers that had launches scheduled for this year encountered delays with their satellites, moving them into 2024."Read more of this story at Slashdot.
Karen K. Ho reports via ARTnews: The British Museum has announced plans to digitize its entire collection in order to increase security and public access, as well as ward off calls for the repatriation of items. The project will require 2.4 million records to be uploaded or upgraded and is estimated to take five years to complete. The museum's announcement on October 18 came after the news that 2,000 items had been stolen from the institution by a former staff member, identified in news reports as former curator Peter Higgs. About 350 have been recovered so far, and last month the museum launched a public appeal for assistance. [...] On the same day the British Museum announced its digitization initiative, Jones and board chairman George Osborne gave oral evidence to the UK Parliament's Culture, Media and Sport Committee. Their comments included an explanation of how the thefts occurred, policy changes made as a result, and how the museum will handle whistleblower complaints going forward. They also gave more details about the British Museum's strategy for digitizing its collection, estimated at a cost of $12.1 million. "We are not asking the taxpayer or the Government for the money; we hope to raise it privately," Osborne said. The increased digital access to the collection would also be part of the museum's response to requests for items to be returned or repatriated. "Part of our response can be: 'They are available to you. Even if you cannot visit the museum, you are able to access them digitally.' That is already available -- we have a pretty good website -- but we can use this as a moment to make that a lot better and a lot more accessible," Osborne said.Read more of this story at Slashdot.
An anonymous reader quotes a report from NBC News: The Supreme Court on Friday blocked in full a lower court ruling that would have curbed the Biden administration's ability to communicate with social media companies about contentious content on such issues as Covid-19. The decision in a short unsigned order (PDF) puts on hold a Louisiana-based judge's ruling in July that specific agencies and officials should be barred from meeting with companies to discuss whether certain content should be stifled. The Supreme Court also agreed to immediately take up the government's appeal, meaning it will hear arguments and issue a ruling on the merits in its current term, which runs until the end of June. Three conservative justices noted that they would have denied the application: Samuel Alito, Clarence Thomas and Neil Gorsuch. "At this time in the history of our country, what the court has done, I fear, will be seen by some as giving the government a green light to use heavy-handed tactics to skew the presentation of views on the medium that increasingly dominates the dissemination of news. That is most unfortunate," Alito wrote in a dissenting opinion. GOP attorneys general in Louisiana and Missouri, along with five social media users, filed the underlying lawsuit, alleging that U.S. government officials went too far in what they characterize as coercion of social media companies to address posts, especially those related to Covid-19. The individual plaintiffs include Covid-19 lockdown opponents and Jim Hoft, the owner of the right-wing website Gateway Pundit. They claim that the government's actions violated free speech protections under the Constitution's First Amendment.Read more of this story at Slashdot.
Shakrai writes: Multiple outlets are reporting that Apple TV Plus has cancelled Jon Stewart's popular show The Problem with Jon Stewart, reportedly over editorial disagreements regarding planned stories on the People's Republic of China and AI. Fans and haters of Apple will both recall that Apple recently made changes to AirDrop, one of the few effective means Chinese dissidents and protesters had for exchanging information off-grid at scale, and will ask why Apple is apparently not only willing, but eager, to carry water for the PRC, overriding both human rights and practical business concerns in the process. "Apple approached Stewart directly and expressed its need for the host and his team to be 'aligned' with the company's views on topics discussed," reports The Verge, citing The Hollywood Reporter. "Rather than falling in line when Apple threatened to cancel the show, Stewart reportedly decided to walk."Read more of this story at Slashdot.
Umar Shakir reports via The Verge: Amazon is fulfilling a small part of its promise to switch from using plastic bubble mailers and air pillows to all recyclable paper packaging for its shipments. The company announced that it has outfitted one facility in Euclid, Ohio, with an upgraded packaging machine that can automatically fold custom-fit boxes to wrap some products, use paper mailers for small items, and slide in paper fillers instead of plastic ones in standard boxes. As Amazon transitions over to curbside recyclable packaging, it will "reduce the company's plastic waste and the amount of plastic pollution that can reach the seas," says Matt Littlejohn, senior vice president of Oceana, a conservation organization. However, Littlejohn questions Amazon's commitment to end plastic use in the US, its largest market, compared to the commitments it made for the UK, Germany, and other markets. Amazon says it'll be a "multiyear effort" to move US warehouses to recyclable paper. "Unfortunately, Amazon, in this announcement, did not make a clear, quantifiable, and time-bound commitment, so it is unclear when, where, and how much real plastic reduction there will be," Littlejohn says.Read more of this story at Slashdot.
echo123 shares a report from the Associated Press: Thousands of information technology workers contracting with U.S. companies have for years secretly sent millions of dollars of their wages to North Korea for use in its ballistic missile program, FBI and Department of Justice officials said. The Justice Department said Wednesday that IT workers dispatched and contracted by North Korea to work remotely with companies in St. Louis and elsewhere in the U.S. have been using false identities to get the jobs. The money they earned was funneled to the North Korean weapons program, FBI leaders said at a news conference in St. Louis. Court documents allege that North Korea's government dispatched thousands of skilled IT workers to live primarily in China and Russia with the goal of deceiving businesses from the U.S. and elsewhere into hiring them as freelance remote employees. The workers used various techniques to make it look like they were working in the U.S., including paying Americans to use their home Wi-Fi connections, said Jay Greenberg, special agent in charge of the St. Louis FBI office. Greenberg said any company that hired freelance IT workers "more than likely" hired someone participating in the scheme. An FBI spokeswoman said Thursday that the North Koreans contracted with companies across the U.S. and in some other countries. "We can tell you that there are thousands of North Korea IT workers that are part of this," spokeswoman Rebecca Wu said. Federal authorities announced the seizure of $1.5 million and 17 domain names as part of the investigation, which is ongoing. FBI officials said the scheme is so prevalent that companies must be extra vigilant in verifying whom they are hiring, including requiring interviewees to at least be seen via video. The IT workers generated millions of dollars a year in their wages to benefit North Korea's weapons programs. In some instances, the North Korean workers also infiltrated computer networks and stole information from the companies that hired them, the Justice Department said. They also maintained access for future hacking and extortion schemes, the agency said. Officials didn't name the companies that unknowingly hired North Korean workers, say when the practice began, or elaborate on how investigators became aware of it. But federal authorities have been aware of the scheme for some time.Read more of this story at Slashdot.
An anonymous reader quotes a report from Reuters: U.S. measures to limit the export of advanced artificial intelligence (AI) chips to China may create an opening for Huawei to expand in its $7 billion home market as the curbs force Nvidia to retreat, analysts say. While Nvidia has historically been the leading provider of AI chips in China with a market share exceeding 90%, Chinese firms including Huawei have been developing their own versions of Nvidia's best-selling chips, including the A100 and the H100 graphics processing units (GPU). Huawei's Ascend AI chips are comparable to Nvidia's in terms of raw computing power, analysts and some AI firms such as China's iFlyTek say, but they still lag behind in performance. Jiang Yifan, chief market analyst at brokerage Guotai Junan Securities, said another key limiting factor for Chinese firms was the reliance of most projects on Nvidia's chips and software ecosystem, but that could change with the U.S. restrictions. "This U.S. move, in my opinion, is actually giving Huawei's Ascend chips a huge gift," Jiang said in a post on his social media Weibo account. This opportunity, however, comes with several challenges. Many cutting-edge AI projects are built with CUDA, a popular programming architecture Nvidia has pioneered, which has in turn given rise to a massive global ecosystem that has become capable of training highly sophisticated AI models such as OpenAI's GPT-4. Huawei's own version is called CANN, and analysts say it is much more limited in terms of the AI models it is capable of training, meaning that Huawei's chips are far from a plug-and-play substitute for Nvidia. Woz Ahmed, a former chip design executive turned consultant, said that for Huawei to win Chinese clients from Nvidia, it must replicate the ecosystem Nvidia created, including supporting clients to move their data and models to Huawei's own platform. Intellectual property rights are also a problem, as many U.S. firms already hold key patents for GPUs, Ahmed said. "To get something that's in the ballpark, it is 5 or 10 years," he added.Read more of this story at Slashdot.
Long-time Slashdot reader Noryungi writes: OpenBSD 7.4 has been officially released. The 55th release of this BSD operating system, known for being security-oriented, brings a lot of new things, including a dynamic tracer, pfsync improvements, loads of security goodies and virtualization improvements. Grab your copy today! As mentioned by Phoronix's Michael Larabel, some of the key highlights include:
- Dynamic Tracer (DT) and Utrace support on AMD64 and i386 OpenBSD
- Power savings for those running OpenBSD 7.4 on Apple Silicon M1/M2 CPUs by allowing deep idle states when available for the idle loop and suspend
- Support for the PCIe controller found on Apple M2 Pro/Max SoCs
- Allow updating AMD CPU microcode when a newer patch is available
- A workaround for the AMD Zenbleed CPU bug
- Various SMP improvements
- Updating the Direct Rendering Manager (DRM) graphics driver support against the upstream Linux 6.1.55 state
- New drivers for supporting various Qualcomm SoC features
- Improved support for soft RAID disks in the OpenBSD installer
- Enabling of Indirect Branch Tracking (IBT) on x86_64 and Branch Target Identifier (BTI) on ARM64 for capable processors
You can download and view all the new changes via OpenBSD.org.Read more of this story at Slashdot.
An anonymous reader shares a Tom's Hardware report: Unfortunately, a default setting in Windows 11 Pro, having its software BitLocker encryption enabled, robs as much as 45 percent of the speed from your SSD as it forces your processor to encrypt and decrypt everything. According to our tests, random writes and reads -- which affect the overall performance of your PC -- get hurt the most, but even large sequential transfers are affected. While many SSDs come with hardware-based encryption, which does all the processing directly on the drive, Windows 11 Pro force-enables the software version of BitLocker during installation, without providing a clear way to opt out. (You can circumvent this with tools like Rufus, if you want, though that's obviously not an official solution as it allows users to bypass Microsoft's intent.) If you bought a prebuilt PC with Windows 11 Pro, there's a good chance software BitLocker is enabled on it right now. Windows 11 Home doesn't support BitLocker, so you won't have encryption enabled there. To find out just how much software BitLocker impacts performance, we ran a series of tests with three scenarios: unencrypted (no BitLocker), software BitLocker (the Windows 11 Pro default), and with hardware BitLocker (OPAL) enabled. While the software encryption increased latency and decreased transfer rates, hardware encryption and no encryption at all were basically tied. If you have software BitLocker enabled, you may want to change your settings.Read more of this story at Slashdot.
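If you want to check what your own machine is doing, a small sketch along these lines should work. It assumes a Windows system with the built-in manage-bde command-line tool and an elevated prompt, and it simply parses the human-readable status output rather than using any official API; the exact wording of that output can vary by Windows build.

```python
# Minimal sketch (not from the article): report which encryption method
# BitLocker is using on a volume by parsing `manage-bde -status` output.
# Assumes Windows with BitLocker tooling and an elevated (admin) prompt.
import subprocess

def bitlocker_status(volume: str = "C:") -> str:
    """Return the 'Encryption Method' line reported by manage-bde, if any."""
    out = subprocess.run(
        ["manage-bde", "-status", volume],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        if "Encryption Method" in line:
            return line.strip()
    return "BitLocker status not reported for " + volume

if __name__ == "__main__":
    # A software method such as "XTS-AES 128" means the CPU is doing the work;
    # "Hardware Encryption" means the drive's own (OPAL) engine is in use.
    print(bitlocker_status("C:"))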
Toyota and BMW are two of the latest automakers to announce they're adopting Tesla's North American Charging System (NACS) plug for their North American EVs, giving drivers access to Tesla's Supercharger network. Ars Technica reports: BMW's announcement applies to all its car brands, which means that in addition to EVs like the BMW i5 or i7, it's also swapping over to NACS for the upcoming Mini EVs as well as the Rolls-Royce Spectre. BMW will start adding native NACS ports to its EVs in 2025, and that same year its customers will gain access to the Tesla Supercharger network. BMW's release doesn't explicitly mention a CCS1-NACS adapter being made available, but it does say that BMW (and Mini and Rolls-Royce) EVs with CCS1 ports will be able to use Superchargers from early 2025. Similarly, the Toyota news applies to its brand as well as Lexus. Toyota says that it will start incorporating NACS ports into "certain Toyota and Lexus BEVs starting in 2025." And customers with Toyota or Lexus EVs that have a CCS1 port will be offered an adapter allowing them to use NACS chargers, also in 2025. And -- you guessed it -- 2025 is when Toyota and Lexus EVs gain access to the Supercharger network. While virtually all the brands that sell EVs in the North American market have announced the switch, there are still a couple holdouts. Stellantis has yet to make the switch, "meaning Alfa Romeo, Chrysler, Dodge, Fiat, Jeep, Maserati, and Ram are all sticking with CCS1 for now," reports Ars. "Volkswagen Group has also yet to take the plunge, which means that Audi and Porsche are also staying with CCS1 for now, as well as the soon-to-be-reborn Scout brand." That said, they're expected to announce a switch to the NACS plug any day now.Read more of this story at Slashdot.
An anonymous reader quotes a report from Krebs on Security: Okta, a company that provides identity tools like multi-factor authentication and single sign-on to thousands of businesses, has suffered a security breach involving a compromise of its customer support unit, KrebsOnSecurity has learned. Okta says the incident affected a "very small number" of customers; however, it appears the hackers responsible had access to Okta's support platform for at least two weeks before the company fully contained the intrusion. In an advisory sent to an undisclosed number of customers on Oct. 19, Okta said it "has identified adversarial activity that leveraged access to a stolen credential to access Okta's support case management system. The threat actor was able to view files uploaded by certain Okta customers as part of recent support cases." Okta explained that when it is troubleshooting issues with customers it will often ask for a recording of a Web browser session (a.k.a. an HTTP Archive or HAR file). These are sensitive files because in this case they include the customer's cookies and session tokens, which intruders can then use to impersonate valid users. "Okta has worked with impacted customers to investigate, and has taken measures to protect our customers, including the revocation of embedded session tokens," their notice continued. "In general, Okta recommends sanitizing all credentials and cookies/session tokens within a HAR file before sharing it." Okta has published a blog post about this incident that includes some "indicators of compromise" that customers can use to see if they were affected. But the company stressed that "all customers who were impacted by this have been notified. If you're an Okta customer and you have not been contacted with another message or method, there is no impact to your Okta environment or your support tickets." The security firm BeyondTrust is among the Okta customers who were involved in the breach. "BeyondTrust Chief Technology Officer Marc Maiffret said that [Okta's] alert came more than two weeks after his company alerted Okta to a potential problem," reports Krebs. They have also published a blog post detailing their findings.Read more of this story at Slashdot.
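Okta's sanitization advice can be approximated in a few lines of Python. The sketch below is only an illustration, not Okta's or BeyondTrust's tooling: it assumes a standard HAR 1.2 JSON file and uses a simple header blocklist that you would extend for your own environment (custom auth headers, tokens in query strings or bodies, and so on).

```python
# Minimal sketch (not vendor tooling): strip cookies and auth-related headers
# from a HAR capture before sharing it with a support team. Field names follow
# the standard HAR 1.2 JSON layout; extend SENSITIVE_HEADERS as needed.
import json
import sys

SENSITIVE_HEADERS = {"cookie", "set-cookie", "authorization", "x-api-key"}

def scrub_har(path_in: str, path_out: str) -> None:
    with open(path_in, "r", encoding="utf-8") as f:
        har = json.load(f)
    for entry in har.get("log", {}).get("entries", []):
        for section in ("request", "response"):
            msg = entry.get(section, {})
            msg["cookies"] = []  # drop parsed cookies entirely
            msg["headers"] = [
                h for h in msg.get("headers", [])
                if h.get("name", "").lower() not in SENSITIVE_HEADERS
            ]
    with open(path_out, "w", encoding="utf-8") as f:
        json.dump(har, f, indent=2)

if __name__ == "__main__":
    # Usage: python scrub_har.py capture.har capture.scrubbed.har
    scrub_har(sys.argv[1], sys.argv[2])
```

Note that a HAR can also embed tokens in URLs and POST bodies, so a blocklist scrub like this is a floor, not a guarantee.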
Can you heat up a pan to 30,000 degrees Fahrenheit? That's the burning question at the center of this proposed class action lawsuit, which claims the advertising for SharkNinja's nonstick cookware violates the laws of physics and thermodynamics. From a report: While SharkNinja is the company best known for its Shark robovacs and Ninja kitchen gadgets, this lawsuit takes issue with the Ninja NeverStick Premium Cookware collection, a line of pots and pans it advertises as having superior nonsticking and nonflaking qualities thanks to its manufacturing process. Instead of making its pans at the measly 900-degree temperature that other brands use, SharkNinja says it heats up the cookware to a maximum of 30,000 degrees Fahrenheit. That process, according to SharkNinja, fuses "plasma ceramic particles" to the surface of the pan, "creating a super-hard, textured surface that interlocks with our exclusive coating for a superior bond." But Patricia Brown, the person who filed this lawsuit, isn't buying it. As cited in Brown's lawsuit, NASA recently said the "surface of the Sun is a blisteringly hot 10,340 degrees Fahrenheit," meaning SharkNinja's manufacturing process reaches about three times that temperature.Read more of this story at Slashdot.
Redis, the go-to in-memory database used as a cache and message broker, is looking to include disk as part of a tiered storage architecture to reduce costs and broaden the system's appeal. From a report: Speaking to The Register, CEO Rowan Trollope said he hoped the move would help customers lower costs and simplify their architecture. Redis counts X (formerly Twitter), Snapchat, and Craigslist among its customers, and it's popular among developers of modern internet-scale applications owing to its ability to create a cache to prevent the main database from overloading. Trollope said the sub-millisecond distributed system gives devs the performance they need, but admitted other systems built for internet scale, such as MongoDB, might offer price advantages. To address this, the company has already created a tiered approach to memory by offering flash support behind its in-memory system. "We have a half-step between disk and memory. For some specific use cases, in gaming for example, a company might use us for leaderboards and other in-game stats, which they need in real time," he said. However, after an initial flush of the game launch, a large chunk of users would finish the game and their accounts would go dormant until the release of a new episode or some new content, when they might return. Trollope said using flash allowed users to dynamically tier memory. "We can take the lesser-used data that hasn't been touched in a while and shuttle it off to flash where it can sit for a while. When the user comes back eventually, it's very easy for us to seamlessly move it from flash back into memory. And that allows the company to save costs," he said.Read more of this story at Slashdot.
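The "cache in front of the main database" pattern the article refers to looks roughly like this in practice. This is a generic cache-aside sketch, not anything from the Register interview; it assumes a local Redis server and the redis-py client, with a placeholder function standing in for the slower primary database. The tiering Trollope describes (memory versus flash versus disk) happens server-side and is invisible to client code like this.

```python
# Minimal cache-aside sketch (illustrative only), assuming a local Redis
# instance and the redis-py client: hot values are served from memory with a
# TTL, so the primary database is only queried on a cache miss.
import json
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def fetch_from_primary_db(user_id: str) -> dict:
    # Placeholder for the real (slower) database query.
    return {"id": user_id, "score": 42}

def get_user(user_id: str, ttl_seconds: int = 300) -> dict:
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)            # cache hit: served from memory
    value = fetch_from_primary_db(user_id)   # cache miss: hit the database
    r.set(key, json.dumps(value), ex=ttl_seconds)
    return value

if __name__ == "__main__":
    print(get_user("leaderboard-player-1"))
```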
Nvidia Research announced today that it has developed a new AI agent, called Eureka, that is powered by OpenAI's GPT-4 and can autonomously teach robots complex skills. From a report: In a blog post, the company said Eureka, which autonomously writes reward algorithms, has, for the first time, trained a robotic hand to perform rapid pen-spinning tricks as well as a human can. Eureka has also taught robots to open drawers and cabinets, toss and catch balls, and manipulate scissors, among nearly 30 tasks. "Reinforcement learning has enabled impressive wins over the last decade, yet many challenges still exist, such as reward design, which remains a trial-and-error process," Anima Anandkumar, senior director of AI research at Nvidia and an author of the Eureka paper, said in the blog post. "Eureka is a first step toward developing new algorithms that integrate generative and reinforcement learning methods to solve hard tasks."Read more of this story at Slashdot.
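Conceptually, the loop the article describes might be sketched as follows. Everything here is a stub of my own devising, not Nvidia's Eureka code: the LLM call and the evaluation are faked, and in the real pipeline each candidate reward function would be used to train a policy in a GPU simulator, with the resulting task performance (not the raw reward value) fed back into the next prompt.

```python
# Conceptual sketch only: an LLM proposes reward-function source code, the
# candidate is evaluated, and the score is kept as feedback for later rounds.
import random

def llm_propose_reward(history):
    """Stub for the LLM call: returns candidate reward-function source code.
    A real system would serialize `history` (code + scores) into the prompt."""
    weight = round(random.uniform(0.1, 2.0), 2)
    return (
        "def reward(spin_speed, drop_count):\n"
        f"    return {weight} * spin_speed - drop_count\n"
    )

def evaluate(reward_fn):
    """Stub rollout: in a real pipeline this would train a policy using the
    candidate reward and report task success on, e.g., pen spinning."""
    spin_speed, drop_count = random.uniform(0, 10), random.randint(0, 5)
    return reward_fn(spin_speed, drop_count)

history = []
for generation in range(5):
    source = llm_propose_reward(history)
    namespace = {}
    exec(source, namespace)                  # materialize the proposed function
    score = evaluate(namespace["reward"])
    history.append((source, score))
    print(f"generation {generation}: score={score:.2f}")

best_source, best_score = max(history, key=lambda item: item[1])
print(f"best candidate (score {best_score:.2f}):\n{best_source}")
```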
An anonymous reader shares a report: A brain-inspired computer chip that could supercharge artificial intelligence by working faster with much less power has been developed by researchers at IBM in San Jose, California. Their massive NorthPole processor chip eliminates the need to frequently access external memory, and so performs tasks such as image recognition faster than existing architectures do -- while consuming vastly less power. "Its energy efficiency is just mind-blowing," says Damien Querlioz, a nanoelectronics researcher at the University of Paris-Saclay in Palaiseau. The work, published in Science, shows that computing and memory can be integrated on a large scale, he says. "I feel the paper will shake the common thinking in computer architecture." NorthPole runs neural networks: multi-layered arrays of simple computational units programmed to recognize patterns in data. A bottom layer takes in data, such as the pixels in an image; each successive layer detects patterns of increasing complexity and passes information on to the next layer. The top layer produces an output that, for example, can express how likely an image is to contain a cat, a car or other object.Read more of this story at Slashdot.
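The layered pattern-detection described there is just a feedforward network. A toy sketch with random, untrained weights (which has nothing to do with NorthPole's actual silicon) shows the data flow from "pixels" up through successive layers to per-class scores.

```python
# Toy feedforward sketch (not NorthPole code): data flows from a fake image
# through two hidden layers to class scores. Weights are random, so the
# "prediction" is meaningless -- this only illustrates the layered structure.
import numpy as np

rng = np.random.default_rng(0)

def layer(x: np.ndarray, w: np.ndarray, b: np.ndarray) -> np.ndarray:
    return np.maximum(0.0, x @ w + b)       # linear transform + ReLU

x = rng.random(64)                           # a fake 8x8 image, flattened
w1, b1 = rng.normal(size=(64, 32)), np.zeros(32)
w2, b2 = rng.normal(size=(32, 16)), np.zeros(16)
w3, b3 = rng.normal(size=(16, 2)), np.zeros(2)

h1 = layer(x, w1, b1)                        # bottom layer: raw pixel patterns
h2 = layer(h1, w2, b2)                       # next layer: more complex patterns
logits = h2 @ w3 + b3                        # top layer: one score per class
probs = np.exp(logits) / np.exp(logits).sum()
print(dict(zip(["cat", "car"], probs.round(3))))
```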
Gaming analytics and esports brand company Gamesquare, which counts Dallas Cowboys owner Jerry Jones as one of its investors, is acquiring the struggling gaming influencer group Faze Clan. From a report: The all-stock deal is worth about $17 million, Bloomberg reports, a steep drop-off from Faze's $725-million valuation at the time of its special purpose acquisition company (SPAC) merger in July of 2022. Since the SPAC merger made it publicly traded on the Nasdaq exchange, Faze Clan, like much of the esports industry, has struggled, with the company posting a $28.4-million loss "through the first half of 2023," according to Bloomberg. Last month, Faze Clan's troubles reached an inflection point that led to the firing of CEO Lee Trink, who once compared the company to the rise of hip-hop during an interview on The Vergecast.Read more of this story at Slashdot.
Pfizer this week revealed that it raised the list price of a course of Paxlovid -- its lifesaving antiviral drug used to reduce the risk of severe COVID-19 in those most vulnerable -- to nearly $1,400, more than double the roughly $530 the US government has paid for the treatment in the emergency phase of the pandemic. From a report: Pfizer CEO Albert Bourla had noted in an investor call at the beginning of the week that the company would increase the price of Paxlovid as it moves from government distribution to the commercial market at the end of this year. But, he did not announce the new list price then. Instead, the company revealed the more than twofold increase in a letter to pharmacies and clinics dated Wednesday. The Wall Street Journal was the first to report the list price of $1,390 after viewing the letter. A Pfizer spokesperson told the Journal that "pricing for Paxlovid is based on the value it provides to patients, providers, and health care systems due to its important role in helping reduce COVID-19-related hospitalizations and deaths." A cost-effectiveness analysis last year determined the value of Paxlovid at between $563 and $906 per treatment course, according to nonprofit drug-pricing watchdog The Institute for Clinical and Economic Review.Read more of this story at Slashdot.