Story

Still here and still important: FreeDOS and its loyal supporters

by
in hardware on (#3QR)
Who cares that it's 20 years old: FreeDOS is still around, fulfilling an interesting and valuable role in the world of tech, and, what's more, it is ardently supported and appreciated by a loyal core of users and developers. Sean Gallagher over at Ars Technica interviews FreeDOS lead developer Jim Hall to find out why FreeDOS still fills a niche:
Because FreeDOS is, as some have called it, "barely an operating system," it allows developers to get very, very close to the hardware. Most modern operating systems have been built specifically to avoid this for security and stability reasons. But FreeDOS has become much more friendly to virtualization and hardware emulation; it's even the heart of the DOSEMU emulator.

The direction the project has taken hasn't exactly followed the road map Hall envisioned after version 1.0. He once had ambitious plans for a next generation of DOS, originally envisioning a modern FreeDOS along the lines of an alternative history of computing. "For a while, I was thinking, 'If MS DOS survived, where would DOS have gone in the last 10 to 15 years?'" Hall said. "I was advocating some sort of multitasking: we could have task switching like what was supported in the 286, where you can put one process to sleep while you do another process. I wanted to have TCP/IP added to the kernel."
FreeDOS might hail from the era before networking, but it's inherently real-time, provides a great teaching tool for getting close to the bare metal, and remains deliciously uncomplicated. That's also the opinion of Gallagher, who spent a whole day {gasp!} running DOS just to remember what it's like. Now get offa my lawn.

Is Wikipedia just as good when the articles are written by a script?

by
in internet on (#3QQ)
At its core, it's a question of quantity versus quality, or of the right to access information. But it's also a question about the role humans should play in an ostensibly human-edited encyclopedia. Here not to provoke those questions but simply to add information to Wikipedia is a Swede by the name of Sverker Johansson. He is single-handedly responsible for 2.7 million articles on Wikipedia (8.5% of the entire site). But 'single-handedly' isn't quite right: he wrote a bot, and the bot does the work.
Mr. Johansson's program scrubs databases and other digital sources for information, and then packages it into an article. On a good day, he says his "Lsjbot" creates up to 10,000 new entries. On Wikipedia, any registered user can create an entry. Mr. Johansson has to find a reliable database, create a template for a given subject and then launch his bot from his computer. The software program searches for information, then publishes it to Wikipedia.
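The workflow described above (find a reliable database, build a template for a subject, fill it from each record) can be sketched in a few lines. This is a hypothetical illustration, not Lsjbot's actual code; the field names, template text, and sample record are all made up:

```python
# Hypothetical sketch of a template-filling article bot in the style of
# Lsjbot. A real bot would scrape a database and publish via the
# MediaWiki API; here we only show the template-fill step.

ARTICLE_TEMPLATE = (
    "{name} is a species of {group} in the family {family}. "
    "It was first described by {author} in {year}."
)

def make_article(record):
    """Fill the template from one database record; skip incomplete rows."""
    required = ("name", "group", "family", "author", "year")
    if not all(record.get(k) for k in required):
        return None  # a real bot would log and skip incomplete records
    return ARTICLE_TEMPLATE.format(**record)

# One record from a (hypothetical) taxonomy database:
record = {
    "name": "Eriophora transmarina",
    "group": "spider",
    "family": "Araneidae",
    "author": "Keyserling",
    "year": 1865,
}
print(make_article(record))
```

Run this against ten thousand database rows and you get ten thousand short, formulaic stubs: exactly the kind of output that fuels the quantity-versus-quality debate below.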

Bots have long been used to author and edit entries on Wikipedia, and an increasingly large share of the site's new content is now written by them. Their use is regulated by a group of Wikipedia users called the "Bot Approvals Group." While Mr. Johansson works to achieve consensus approval for his project, he and his bot-loving peers expect to continue facing resistance. "There is a vocal minority who don't like it," he said during a recent speech on his work. Still, he soldiers on.
Complex questions are at play here: is it better that Wikipedia lack articles humans can't or won't write? Can robot-written articles be trusted? Should they be labeled and approved? What kind of criteria would be applied, and who would fund and run that kind of oversight body? And lastly: is all this work even worth it in the first place? Do these bot-written articles add any value to everyone's favorite information site?

More coverage at Business Spectator (Australia) and Popular Science.

New Raspberry Pi B+ announced

by
in hardware on (#3QP)
The Raspberry Pi keeps getting better: the Foundation has announced an updated version of the Model B. Looks like a great device.

Main changes are:
  1. More GPIO. The GPIO header has grown to 40 pins, while retaining the same pinout for the first 26 pins as the Model B.
  2. More USB. We now have 4 USB 2.0 ports, compared to 2 on the Model B, and better hotplug and overcurrent behaviour.
  3. Micro SD. The old friction-fit SD card socket has been replaced with a much nicer push-push micro SD version.
  4. Lower power consumption. By replacing linear regulators with switching ones we've reduced power consumption by between 0.5W and 1W.
  5. Better audio. The audio circuit incorporates a dedicated low-noise power supply.
  6. Neater form factor. We've aligned the USB connectors with the board edge, moved composite video onto the 3.5mm jack, and added four squarely-placed mounting holes.
The price is still $35.

Interestingly, Hackaday was there first, via a post just yesterday about a guy who received one of these boards even before they were announced: probably a simple shipping error. They point out that the new form factor means old cases won't fit. And they - and I - are excited about the better SD slot too: the old friction-fit socket was a real weak point.

What happens when digital communities are abandoned?

by
in games on (#3QN)
Laura Hall over at The Atlantic asks: what happens when digital communities are abandoned? Although she covers the closing of GeoCities, she's really more interested in the virtual worlds we build in MUDs and immersive games. They swarm with players for years, but as user interest wanes and gamers go elsewhere, what do these worlds look like?
When Second Life launched in 2003, the world was captivated by visions of Neal Stephenson's Snow Crash come to life. The virtual world isn't a game: it's a venue, a platform, a plot of undeveloped land, a blank canvas, an open world. Users make of it what they will. ...

But that was nearly 10 years ago. I wondered: what happened to all of those buildings? Were people still making use of them? So I logged in. The world of Second Life, it turns out, is not abandoned. Estimates put the current active user-base around 600,000 members; in its heyday, it boasted between 60 and 80 thousand simultaneous logins. There are often a handful of people in most of the spaces you'll visit, but it's easy to find privacy. Here and there are signs that point to its lack of people: "space for rent", "band wanted." But the sheer variety of environments, and the obvious care that people put into them, remains stunning.

Five NSA programs you should know by name

by
in security on (#3QM)
You may be sick and tired of hearing about NSA surveillance, but we may as well get used to it: until legislators decide to put an end to mass surveillance, these programs are here to stay. Rather than ignore them, then, better to get to know them. RadioOpenSource has provided an excellent overview of five NSA programs currently in force that you should know by name.
  1. XKeyscore
  2. Fascia
  3. Optic Nerve
  4. Boundless Informant
  5. Dishfire
Even the names are evocative of what these programs do, and of the sense of authority, and lack of accountability, that led to their development. An excellent read. While you go through it, see if you can suggest names for other NSA programs that would operate in the same vein. For example: "Operation Colonoscopy."

Monday poll: moderation schemes I like

by
in pipedot on (#3QK)
Today's Monday poll looks at moderation schemes. No other aspect of a site determines its "feel" more than the ability of users to comment and for those comments to lead to conversation. Get it right and you've got a great discussion on your hands. But get it wrong and the "right" comments lead to groupthink, the trolls and kooks take over, or the place becomes a giant flamewar.

I personally think no site has gotten it just right yet. But we began an interesting conversation about it on this Pipedot article.

There are a lot of models out there, and some of them overlap a bit. OSNews.com's moderation scheme, for example, is pretty close to Slashdot's, although it gives +1 points for funny. A lot of sites running on modified Drupal or Joomla systems don't even deal with moderation: you just post your comment and it goes on the list, though site admins reserve the right to nuke anything offensive to corporate powers, and there's no threading. There's also the Usenet/killfile model, where karma attaches to individuals rather than posts [ed. note: I should've added that to the poll, dang it].

Have your say in the poll to the right. It's a Borda count, so give a "1" to the system you like best, a "2" to the one you like a bit less, and so on.

Emails from Pixar's Catmull Revealed in Silicon Valley Anti-Poaching Lawsuit

by
Anonymous Coward
in legal on (#3QJ)
Ed Catmull is legendary in the fields of computer graphics and animation; he was an important researcher in 3D computer graphics in the '70s, became head of Lucasfilm's Pixar computer animation division, and has essentially remained in that role ever since through the sale/spinoff of Pixar to Steve Jobs in 1986, its years as an independent producer of feature-length animated films, and its acquisition by Disney in 2006. He's just published a book on creative leadership.

While Catmull has lots of fans in Silicon Valley and beyond, he's emerging as a key figure in an antitrust lawsuit by employees over the 'gentlemen's agreement' by a handful of companies including Apple, Google, Intel, and Pixar, to avoid recruiting each other's employees, thus avoiding a bidding war on talent. Emails recovered during the discovery phase of an ongoing class action lawsuit reveal that Catmull was a zealous enforcer of the pact among digital animation studios, including Pixar, Lucasfilm/ILM, and Dreamworks; at one point, after Pixar was acquired by Disney, he even wrote an email persuading Disney Studios Chairman Dick Cook to put the arm on a sister Disney studio that was poaching Dreamworks employees:
I know that Zemeckis' company will not target Pixar, however, by offering higher salaries to grow at the rate they desire, people will hear about it and leave. We have avoided wars up in Norther[n] California because all of the companies up here - Pixar, ILM [Lucasfilm], Dreamworks, and couple of smaller places [sic]- have conscientiously avoided raiding each other.
The Catmull emails also reveal that Sony was recruited to join the pact/cartel, but Sony refused to play ball. This seemed to raise Catmull's testosterone level a bit. Catmull to Cook again:
Just this last week, we did have a recruiter working for ILM [Lucasfilm] approach some of our people. We called to complain and the recruiter immediately stopped. This kind of relationship has helped keep the peace in the Bay Area and it is important that we continue to use restraint.

Now that Sony has announced their intentions with regard to selling part of their special effects business, and given Sony's extremely poor behavior in its recruiting practices, I would feel very good about aggressively going after Sony people.
In the deposition, Catmull said he never followed through with the threat to go after Sony's employees.

(I saw this story on OSNews, which drew a fair number of comments).

Maybe Runaway wasn't so far-fetched after all...

by
in science on (#3QH)
That the 1984 movie 'Runaway' with Tom Selleck was a lackluster performer is undisputed. However, the movie did showcase an interesting piece of tech: bullets that could home in on a target and change course mid-flight. At the time, these mini missiles seemed a bit over the top; that level of miniaturization and processing speed was still the realm of science fiction.

Well, apparently DARPA was paying attention. They have released footage of their Extreme Accuracy Tasked Ordnance (EXACTO) program in action.

These would be truly scary to go up against on the battlefield. Having .50 caliber bullets fired at you is one thing; having them chase you down is quite another.
"This video shows EXACTO rounds maneuvering in flight to hit targets that are offset from where the sniper rifle is aimed. EXACTO's specially designed ammunition and real-time optical guidance system help track and direct projectiles to their targets by compensating for weather, wind, target movement and other factors that could impede successful hits."
The video shows the ordnance making quite a severe course correction before striking its target.

The full program information is here.

With the ability to turn anyone into a sniper, what does this bode for the future battlefield? Will these inevitably end up in militarized police usage?

Unikernels: rise of the virtual-library operating system

by
in code on (#3QG)
Messrs. Anil Madhavapeddy and David J. Scott, over at the Association for Computing Machinery (ACM), ask: What if all layers in a virtual appliance were compiled within the same safe, high-level language framework? Good question, and I suspect we'll find out soon enough, because the trend in virtualization seems to be leading us in this direction.
While operating-system virtualization is undeniably useful, it adds yet another layer to an already highly layered software stack now including: support for old physical protocols (e.g., disk standards developed in the 1980s, such as IDE); irrelevant optimizations (e.g., disk elevator algorithms on SSD drives); backward-compatible interfaces (e.g., POSIX); user-space processes and threads (in addition to VMs on a hypervisor); and managed-code runtimes (e.g., OCaml, .NET, or Java). All of these layers sit beneath the application code. Are we really doomed to adding new layers of indirection and abstraction every few years, leaving future generations of programmers to become virtual archaeologists as they dig through hundreds of layers of software emulation to debug even the simplest applications?
This project intends to reduce the many layers of software and operating system to simple-API systems that can be installed and used like virtual appliances, perhaps [ed. note: this is my analogy, not the authors'] the way Busybox reduces the standard POSIX utilities to a single, smaller binary executable.

The Post-Silicon future

by
in hardware on (#3QF)
It's hard to overstate the impact of the silicon chip and the advances wrought by Moore's law and steady research and development in miniaturizing transistors on silicon. But we're getting close to the limits, and getting beyond 9nm puts us at the limits of physics, it would seem. So what's next?

IBM is hoping it will be the first to find out. Not one to shy away from the big bets, IBM is putting $3B into researching the next step, including having a go at 7nm processes, new semiconducting materials like Gallium-Arsenide, and technologies like carbon nanotubes and graphene. At 7nm, the game changes significantly, and quantum physics begins to matter as much as traditional physics. Steve Torbak points out there's hope for technologies like racetrack memory and neuromorphic memory, too.

Or maybe, there's still room for improvement with what we've got. We're not done with Systems-on-a-Chip, after all, and DARPA has recently taken this approach to put an entire communications stack on a dime-sized chip.

[Ed. note: All I know is, to watch the next generation of silly cat videos, we're going to need a serious boost in hardware. /grin]