Feed aggregator

FSF Says Nintendo's New DRM Allows Them to Remotely Render User's Device 'Permanently Unusable'

Slashdot.org - Sun, 12/21/2025 - 11:34
"In the lead up to its Switch 2 console release, Nintendo updated its user agreement," writes the Free Software Foundation, warning that Nintendo now claims "broad authority to make consoles owned by its customers permanently unusable." "Under Nintendo's most aggressive digital restrictions management (DRM) update to date, game console owners are now required to give Nintendo the unilateral right to revoke access to games, security updates, and the Internet, at its sole discretion." The new agreement states: "You acknowledge that if you fail to comply with [Nintendo's restrictions], Nintendo may render the Nintendo Account Services and/or the applicable Nintendo device permanently unusable in whole or in part...." There are probably other reasons that Nintendo has and will justify bricking game consoles, but here are some that we have seen reported: — "Tampering" with hardware or software in pretty much any way; — Attempting to play a back-up game; — Playing a "used" game; or — Use of a third-party game or accessory... Nintendo's promise to block a user from using their game console isn't just an empty threat: it has already been wielded against many users. For example, within a month of the Switch 2's release, one user unknowingly purchased an open-box return that had been bricked, and despite functional hardware, it was unusable for many games. In another case, a user installing updates for game cartridges purchased via a digital marketplace had their console disabled. Though it's unclear exactly why they were banned, it's possible that the cartridge's previous owner made a copy and an online DRM check determined that the current and previous owner's use were both "fraudulent." The user only had their console released through appealing to Nintendo directly and providing evidence of their purchase, a laborious process. Nintendo's new console banning spree is just one instance of the threat that nonfree software and DRM pose to users. 
DRM is but one injustice posed by nonfree software, and it is the target of the FSF's Defective by Design campaign. As with all software, users ought to be able to freely copy, study, and modify the programs running on their devices. Proprietary software developers actively oppose and antagonize their users; in Nintendo's case, this means punishing legitimate users and burdening them with proving that their use is "acceptable." Console users shouldn't have to tread so carefully with a console they own, nor, should they misstep, have to beg Nintendo to allow them to use their consoles again.

Read more of this story at Slashdot.

Trump Admin to Hire 1,000 for New 'Tech Force' to Build AI Infrastructure

Slashdot.org - Sun, 12/21/2025 - 10:34
An anonymous reader shared this report from CNBC: The Trump administration on Monday unveiled a new initiative dubbed the "U.S. Tech Force," comprising about 1,000 engineers and other specialists who will work on artificial intelligence infrastructure and other technology projects throughout the federal government. Participants will commit to a two-year employment program working with teams that report directly to agency leaders in "collaboration with leading technology companies," according to an official government website. ["...and work closely with senior managers from companies partnering with the Tech Force."] Those "private sector partners" include Amazon Web Services, Apple, Google Public Sector, Dell Technologies, Microsoft, Nvidia, OpenAI, Oracle, Palantir, Salesforce and numerous others [including AMD, IBM, Coinbase, Robinhood, Uber, xAI, and Zoom], the website says. The Tech Force shows the Trump administration increasing its focus on developing America's AI infrastructure as it competes with China for dominance in the rapidly growing industry... The engineering corps will be working on "high-impact technology initiatives including AI implementation, application development, data modernization, and digital service delivery across federal agencies," the site says. "Answer the call," says the new website at TechForce.gov. "Upon completing the program, engineers can seek employment with the partnering private-sector companies for potential full-time roles — demonstrating the value of combining civil service with technical expertise." [And those private sector companies can also nominate employees to participate.] "Annual salaries are expected to be in the approximate range of $150,000 to $200,000."

Read more of this story at Slashdot.

While Releasing 'Avatar 3', James Cameron Questions the Future of Movies

Slashdot.org - Sun, 12/21/2025 - 07:34
"If I get to do another Avatar film, it'll be because the business model still works," James Cameron tells CNN in a video interview — adding "That I can't guarantee, as I sit here today. That'll play out over the next month, really." He says theatre is a "sacred space," and while it will never go away, "I think that it could fall below a threshhold where the kinds of movies that I like to make and that I like to see... won't be sustainable, they won't be economically viable. And that can happen. We're very close to that right now." The Wrap notes he filmed his new movie at the same time as its predecessor, The Way of Water." "We did all the performance capture in an 18-month period for both films. Then we did a lot of the virtual camera work to figure out exactly how we were going to do the live-action," Cameron explained. "Then we did all live-action together for both films. Then we split it and said, All right, now we just got to finish [movie] two....." While Cameron has been iffy about whether the previously announced fourth and fifth films will actually happen, he has already shot some of the fourth movie. "We're in a fluid scenario. Theatrical's contracting, streaming is expanding. People's habit patterns are changing. The teen demo consumes media differently than what we grew up with. And how much is it changing? Does theatrical contract to a point where it just stops right and doesn't get any smaller because we still value that, or does it continue to wither away?" Cameron said. It's a theme he continued in his interview with The Hollywood Reporter" "This can be the last one. There's only one [unanswered question] in the story. We may find that the release of Avatar 3 proves how diminished the cinematic experience is these days, or we may find it proves the case that it's as strong as it ever was — but only for certain types of films. It's a coin toss right now. We won't know until the middle of January." 
I ask something that might sound odd: What do you want to happen? But Cameron gets the implication. "That's an interesting question," he says. "I feel I'm at a bit of a crossroads. Do I want it to be a wild success — which almost compels me to continue and make two more Avatar movies? Or do I want it to fail just enough that I can justify doing something else...?" "What won't happen is, I won't go down the rabbit hole of exclusively making only Avatar for multiple years. I'm going to figure out another way that involves more collaboration. I'm not saying I'm going to step away as a director, but I'm going to pull back from being as hands-on with every tiny aspect of the process..." Cameron won't reveal his next project — and he might even be unsure himself — but he does offer intriguing hints. In addition to co-directing Billie Eilish's upcoming 3D concert documentary, Hit Me Hard and Soft, Cameron has another globe-trotting documentary adventure in the works, the details of which are under wraps. His next narrative film probably won't be Ghosts of Hiroshima, which has generated considerable press after Cameron acquired the rights to Charles Pellegrino's book chronicling the true story of Tsutomu Yamaguchi, who in 1945 survived the nuclear blasts at both Hiroshima and Nagasaki. Cameron promised Yamaguchi on his deathbed in 2010 that he'd make the film. "The postapocalypse is not going to be the fun that it is in science fiction," he says. "It's not going to have mutants and monsters and all sorts of cool stuff. It's hell...." Cameron first portrayed the apocalypse in his 1984 debut, The Terminator, a franchise he's quietly working on revisiting. "Once the dust clears on Avatar in a couple of months, I'm going to really plunge into that," he says. "There are a lot of narrative problems to solve. The biggest is how do I stay enough ahead of what's really happening to make it science fiction?"
Asked whether he's cracked the premise, Cameron replies, "I'm working on it," but his sly smile suggests that he has.... "There needs to be a broader interpretation of Terminator and the idea of a time war and super intelligence. I want to do new stuff that people aren't imagining." Maybe Cameron's best response was what he told USA Today: "Let's do another interview in a year and then I'll tell you what my plans are," Cameron, 71, says with a grin. For now, he's still catching his breath.

Read more of this story at Slashdot.

Hiero Open Source - Hedera

Linux News - Sun, 12/21/2025 - 07:01
Categories: Linux

Is America's Tech Industry Already Facing a Recession?

Slashdot.org - Sun, 12/21/2025 - 03:34
America's unemployment rate for tech jobs rose to 4% in November, and "has been steadily rising since May," reports the Washington Post (citing data from the IT training/certifications company CompTIA). Between October and November, the number of technology workers across different industries fell by 134,000, while the number of people working in the tech industry declined by more than 6,800. Tech job postings were also down by more than 31,800, the report found, citing data from the Bureau of Labor Statistics and California-based market intelligence firm Lightcast. "The data is pretty definitive that the tech industry is struggling," said Mark Zandi, Moody's chief economist. "There's a jobs recession in the industry, and it feels like that's going to continue given the slide in postings...." The unemployment rate in the tech industry still sits below the national rate, which in November hit 4.6 percent, the highest since 2021. However, that gap has been narrowing, with tech unemployment rising faster in recent months than is the case nationally.... Employers are largely in "wait and see" mode when it comes to hiring given the current uncertainties surrounding the economy and the impact of AI, so they're likely to delay backfilling, Herbert said, citing CompTIA's surveys of chief information officers. But Justin Wolfers, professor of public policy and economics at the University of Michigan, said uncertainty is likely to continue in the foreseeable future. "I'm feeling substantially more pessimistic," Wolfers said, recalling that Federal Reserve Chair Jerome H. Powell recently suggested that federal job numbers may be overstated. "That's pretty grim." Technology companies have announced more than 141,000 job cuts so far this year, representing a 17 percent increase from the same period last year, according to outplacement firm Challenger, Gray & Christmas.
At the same time Big Tech companies like Google, Microsoft, Meta and Amazon have announced plans to invest up to $375 billion in AI infrastructure this year. "AI is quickly becoming a requirement, with 41 percent of all active job postings representing AI roles or requiring AI skills, according to CompTIA's analysis," the article points out. Economist Zandi tells the Post that "If you have AI skills, there seems to be jobs. But if you don't, I think it's going to feel like you've been hit by a dump truck."

Read more of this story at Slashdot.

Rust's 'Vision Doc' Makes Recommendations to Help Keep Rust Growing

Slashdot.org - Sun, 12/21/2025 - 00:34
The team authoring the Rust 2025 Vision Doc interviewed Rust developers to find out what they liked about the language — and have now issued three recommendations "to help Rust continue to scale across domains and usage levels":

  1. Enumerate and describe Rust's design goals and integrate them into our processes, helping to ensure they are observed by future language designers and the broader ecosystem.
  2. Double down on extensibility, introducing the ability for crates to influence the developer experience and the compilation pipeline.
  3. Help users to navigate the crates.io ecosystem and enable smoother interop.

The real "empowering magic" of Rust arises from achieving a number of different attributes all at once — reliability, efficiency, low-level control, supportiveness, and so forth. It would be valuable to have a canonical list of those values that we could collectively refer to as a community and that we could use when evaluating RFCs or other proposed designs... We recommend creating an RFC that defines the goals we are shooting for as we work on Rust... One insight from our research is that we don't need to define which values are "most important". We've seen that for Rust to truly work, it must achieve all the factors at once... We recommend doubling down on extensibility as a core strategy. Rust's extensibility — traits, macros, operator overloading — has been key to its versatility. But that extensibility is currently concentrated in certain areas: the type system and early-stage proc macros. We should expand it to cover supportive interfaces (better diagnostics and guidance from crates) and compilation workflow (letting crates integrate at more stages of the build process)... Doubling down on extensibility will not only make current Rust easier to use, it will enable and support Rust's use in new domains. Safety-critical applications in particular require a host of custom lints and tooling to support the associated standards.
Compiler extensibility allows Rust to support those niche needs in a more general way. We recommend finding ways to help users navigate the crates.io ecosystem... [F]inding which crates to use presents a real obstacle when people are getting started. The Rust org maintains a carefully neutral stance, which is good, but also means that people don't have anywhere to go for advice on a good "starter set" of crates... Part of the solution is enabling better interop between libraries.
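The "traits, macros, operator overloading" extensibility the Vision Doc credits can be seen in miniature: implementing a standard-library trait lets ordinary user code extend an operator to a new type, with no compiler changes. A minimal sketch (the `Meters` type is invented for illustration):

```rust
use std::ops::Add;

// A user-defined unit type; `Meters` is invented for this sketch.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Meters(f64);

// Implementing the standard `Add` trait makes `+` work on our type.
// This is the library-level extensibility the Vision Doc proposes
// extending to diagnostics and the compilation pipeline as well.
impl Add for Meters {
    type Output = Meters;
    fn add(self, rhs: Meters) -> Meters {
        Meters(self.0 + rhs.0)
    }
}

fn main() {
    let total = Meters(1.5) + Meters(2.5);
    println!("{:?}", total); // prints "Meters(4.0)"
}
```

The same mechanism is what lets crates define their own operators, iterators, and smart pointers that feel native to the language.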

Read more of this story at Slashdot.

Disney is Doing Cross-Site Authentication All Wrong

BrandonChecketts.com - Sat, 12/20/2025 - 21:20

Disney runs quite a few properties including disneyplus.com, hulu.com, espn.com, abc.com, and a bunch of obviously Disney sites like shopdisney.com, disneyworld.disney.go.com, and disneycruise.disney.go.com. They have a centralized authentication system so all of these sites can use the same email address and password to log in.

It has a couple of major problems, though:

  1. It isn’t obvious that the login is shared. They share a logo when logging in, but it’s not obvious to users that these sites share the same credentials. I wouldn’t expect that espn.com uses the same login as hulu.com, and I know that Disney owns both of them! Also, password managers aren’t aware that the logins are tied together, so when you log in to one site and your password doesn’t work because you don’t realize they are shared, you end up resetting it. And then you’ve broken your password for another site that you didn’t realize was connected.
  2. Users can’t verify that a site is legitimate. It would be trivial for an attacker to create a fake Disney site and mimic the Disney login system to capture passwords. I actually noticed this because my wife was logging into a site for Disney gift cards and I seriously thought it was a scam.

Disney should implement a shared login that uses a common login site (like login.disney.com) so that users can know that it is a legitimate Disney site. This fixes the issues above. Users can know that they trust login.disney.com. Password managers will use the same credentials. And it will be more difficult for attackers to mimic a site if users know that login.disney.com is the only legitimate place to log in.
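The recommendation amounts to a standard single-sign-on pattern: every property redirects to one well-known login origin and receives the user back afterward. A minimal sketch of building such a redirect URL (login.disney.com is the post's suggestion; the parameter names and the `percent_encode` helper are invented for illustration, and a real implementation would use a vetted URL library):

```rust
// Minimal percent-encoding of a query-parameter value (RFC 3986
// unreserved characters pass through; everything else is %-escaped).
fn percent_encode(s: &str) -> String {
    s.bytes()
        .map(|b| match b {
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'.' | b'_' | b'~' => {
                (b as char).to_string()
            }
            _ => format!("%{:02X}", b),
        })
        .collect()
}

// Each Disney property would send users to the single trusted login
// origin, carrying where they came from and where to return to.
fn login_redirect_url(origin_site: &str, return_path: &str) -> String {
    format!(
        "https://login.disney.com/authorize?origin={}&return_to={}",
        percent_encode(origin_site),
        percent_encode(return_path),
    )
}

fn main() {
    println!("{}", login_redirect_url("espn.com", "/account"));
}
```

Because every login then happens on one origin, users (and password managers) have a single address to trust, which is the anti-phishing property the post is asking for.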

The post Disney is Doing Cross-Site Authentication All Wrong appeared first on Brandon Checketts.

Categories: Web

Bell Labs 'Unix' Tape from 1974 Successfully Dumped to a Tarball

Slashdot.org - Sat, 12/20/2025 - 21:02
Archive.org now has a page with "the raw analog waveform and the reconstructed digital tape image (analog.tap), read at the Computer History Museum's Shustek Research Archives on 19 December 2025 by Al Kossow using a modified tape reader and analyzed with Len Shustek's readtape tool." A Berlin-based retrocomputing enthusiast has created a page with the contents of the tape ready for bootstrapping, "including a tar file of the filesystem," and instructions on dumping an RK05 disk image from tape to disk (and what to do next). Research professor Rob Ricci at the University of Utah's school of computing posted pictures and video of the tape-reading process, along with several updates. ("So far some of our folks think they have found Hunt The Wumpus and the C code for a Snobol interpreter.") University researcher Mike Hibler noted the code predates the famous comment "You are not expected to understand this" — and found part of the C compiler with a copyright of 1972. The version of Unix recovered seems to have some (but not all) of the commands that later appeared in Unix v5, according to discussion on social media. "UNIX wasn't versioned as we know it today," explains University of Utah PhD student Thalia Archibald, who researched early Unix history (including the tape) and also worked on its upload. "In the early days, when you wanted to cut a tape, you'd ask Ken if it was a good day — whether the system was relatively bug-free — and copy off the research machine... I've been saying 'it's probably V5 minus a tiny bit,' which turned out to be quite true."

Read more of this story at Slashdot.

Does AI Really Make Coders Faster?

Slashdot.org - Sat, 12/20/2025 - 18:38
One developer tells MIT Technology Review that AI tools weaken the coding instincts he used to have. And beyond that, "It's just not fun sitting there with my work being done for me." But is AI making coders faster? "After speaking to more than 30 developers, technology executives, analysts, and researchers, MIT Technology Review found that the picture is not as straightforward as it might seem..." For some developers on the front lines, initial enthusiasm is waning as they bump up against the technology's limitations. And as a growing body of research suggests that the claimed productivity gains may be illusory, some are questioning whether the emperor is wearing any clothes.... Data from the developer analytics firm GitClear shows that most engineers are producing roughly 10% more durable code — code that isn't deleted or rewritten within weeks — since 2022, likely thanks to AI. But that gain has come with sharp declines in several measures of code quality. Stack Overflow's survey also found trust and positive sentiment toward AI tools falling significantly for the first time. And most provocatively, a July study by the nonprofit research organization Model Evaluation & Threat Research (METR) showed that while experienced developers believed AI made them 20% faster, objective tests showed they were actually 19% slower... Developers interviewed by MIT Technology Review generally agree on where AI tools excel: producing "boilerplate code" (reusable chunks of code repeated in multiple places with little modification), writing tests, fixing bugs, and explaining unfamiliar code to new developers. Several noted that AI helps overcome the "blank page problem" by offering an imperfect first stab to get a developer's creative juices flowing. It can also let nontechnical colleagues quickly prototype software features, easing the load on already overworked engineers. These tasks can be tedious, and developers are typically glad to hand them off. 
But they represent only a small part of an experienced engineer's workload. For the more complex problems where engineers really earn their bread, many developers told MIT Technology Review, the tools face significant hurdles... The models also just get things wrong. Like all LLMs, coding models are prone to "hallucinating" — it's an issue built into how they work. But because the code they output looks so polished, errors can be difficult to detect, says James Liu, director of software engineering at the advertising technology company Mediaocean. Put all these flaws together, and using these tools can feel a lot like pulling a lever on a one-armed bandit. "Some projects you get a 20x improvement in terms of speed or efficiency," says Liu. "On other things, it just falls flat on its face, and you spend all this time trying to coax it into granting you the wish that you wanted and it's just not going to..." There are also more specific security concerns, she says. Researchers have discovered a worrying class of hallucinations where models reference nonexistent software packages in their code. Attackers can exploit this by creating packages with those names that harbor vulnerabilities, which the model or developer may then unwittingly incorporate into software. Other key points from the article:

  1. LLMs can only hold limited amounts of information in context windows, so "they struggle to parse large code bases and are prone to forgetting what they're doing on longer tasks."
  2. "While an LLM-generated response to a problem may work in isolation, software is made up of hundreds of interconnected modules. If these aren't built with consideration for other parts of the software, it can quickly lead to a tangled, inconsistent code base that's hard for humans to parse and, more important, to maintain."
  3. "Accumulating technical debt is inevitable in most projects, but AI tools make it much easier for time-pressured engineers to cut corners, says GitClear's Harding. And GitClear's data suggests this is happening at scale..."
  4. "As models improve, the code they produce is becoming increasingly verbose and complex, says Tariq Shaukat, CEO of Sonar, which makes tools for checking code quality. This is driving down the number of obvious bugs and security vulnerabilities, he says, but at the cost of increasing the number of 'code smells' — harder-to-pinpoint flaws that lead to maintenance problems and technical debt."

Yet the article cites a recent Stanford University study that found employment among software developers aged 22 to 25 dropped nearly 20% between 2022 and 2025, "coinciding with the rise of AI-powered coding tools." The story is part of MIT Technology Review's new Hype Correction series of articles about AI.
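One pragmatic guard against the hallucinated-package attack described above is to vet any model-proposed dependency against an explicitly approved list before it ever reaches a build file. A minimal sketch (the function and all package names are invented for illustration; real projects would check against their registry's actual contents and an internal policy list):

```rust
// Partition model-proposed dependency names into those on a vetted
// allowlist and those that need human review (possible typos or
// hallucinated packages an attacker could squat on).
fn vet_dependencies<'a>(
    proposed: &[&'a str],
    allowlist: &[&str],
) -> (Vec<&'a str>, Vec<&'a str>) {
    let mut approved = Vec::new();
    let mut flagged = Vec::new();
    for &dep in proposed {
        if allowlist.iter().any(|&a| a == dep) {
            approved.push(dep);
        } else {
            flagged.push(dep); // never install these automatically
        }
    }
    (approved, flagged)
}

fn main() {
    let allowlist = ["serde", "tokio", "rand"];
    // The second name is made up, as a hallucinated suggestion might be.
    let proposed = ["serde", "serde-utils-pro"];
    let (approved, flagged) = vet_dependencies(&proposed, &allowlist);
    println!("approved: {:?}, flagged: {:?}", approved, flagged);
}
```

A check like this doesn't make the model hallucinate less; it just ensures a fabricated name fails loudly at review time instead of silently pulling in an attacker-controlled package.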

Read more of this story at Slashdot.
