Atom now available on Windows

in code on (#3QB)
If you haven't heard of Atom already, now's a good chance to get acquainted. It's GitHub's open source editor, and it's pretty awesome. The developers behind it write:
At GitHub, we're building the text editor we've always wanted. A tool you can customize to do anything, but also use productively on the first day without ever touching a config file. Atom is modern, approachable, and hackable to the core. We can't wait to see what you build with it.
It's different from traditional text editors in a couple of important ways, including a web-based core and Node.js integration. Atom bills itself as "a hackable text editor for the 21st Century": it is built on Node and Chromium and is very easy to extend and customize. Best of all, it is now available on Windows.

I have been using it on OS X for several months and like it a lot. It is great for Ruby, Python, HTML, etc. One of its few shortcomings is that it really isn't great for editing very large text files (megabytes of logs, for example). It has been available for Mac OS X for a while already, and Linux users who do not want to wait for an official release can find a build howto here.

Curious, or ready to start coding? Here are five tips for getting started.

Re: i don't understand (Score: 5, Insightful)

by genx@pipedot.org on 2014-07-11 09:56 (#2EX)

This kind of reasoning has been flourishing in recent years, and it would be fine if one were only using one program. But it does not scale. I have perhaps 30 programs running at the same time: many terminals, many PDF viewers, several text editors, one IDE, one web browser, one mail agent, one music player, and many more. Just imagine when each of these 30 pieces of software starts requiring 10 or 20 times the memory and power it should need.

And I do not want to pay n × $20 to add memory modules (supposing my motherboard even supports it), and I do not want to buy a new computer (or motherboard + processor + a new kind of RAM) just to get roughly the same functionality I had before (sometimes a bit more, sometimes a bit less, generally little more than a fashionable change of appearance or engine, not an astounding improvement). All right, it works fine on the developer's machine, which he upgraded, but because of his laziness or lack of care, millions of users will have to buy more RAM: millions of dollars spent for this.

On a side note, I do agree that having a good computer is a lot of comfort for development work, but there is a huge drawback: developers totally lose contact with reality, the reality of the computers users will (try to) run the program on. If they had average computers, they would be aware and would care a bit more about what they do. I do not mean they should ultra-optimize everything; just avoiding bloat would be a start. You cannot produce good-quality software on a bleeding-edge setup, because you will not see or feel many of the problems; and a one-hour test on a small computer at release time is not enough to experience them. The laziness and convenience of the developer are quite opposed to the convenience of the end user.

I recently had to use a program from a major chipmaker that did hardly anything more than display folding lists with checkboxes. The thing took several hundred MB when launched and was as slow as can be imagined (click on a checkbox, then wait 3 seconds before the box is checked)! And, to add insult to injury, with such "good" programming, it started leaking memory like hell. It was not a memory leak, it was a memory flood: after 2 hours, it would eat an extra GB of RAM. Oh, on a powerful computer, it could be run. The programmer likely had such a computer and did not even notice these major problems.