Comment 2F4 Re: i don't understand

Story

Atom now available on Windows

i don't understand (Score: 3, Insightful)

by pete@pipedot.org on 2014-07-11 00:35 (#2ER)

I don't understand how these projects balloon to such sizes - 169M extracted... it's just a fancy text editor. And while it's only a poor comparison (due to no windowed GUI), VIM is 2.1M and, AFAIK, does way more, learning curves aside. Or how, just opening it with no files, it spawns 6 processes consuming ~35-40M each. I know it's alpha, but I doubt you'll see it shrink considerably in size, if not grow larger. I've been seeing more and more of this lately: ginormous programs that offer little in proportion.
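
For anyone who wants to reproduce those per-process figures, here is a minimal sketch of how one might tally them with Python's psutil. This is my own illustration, not how the parent measured it; matching processes by the name "atom" is an assumption and may need adjusting per platform.

import psutil

# Sum the resident memory of every process whose name contains "atom".
# The name filter is an assumption; Atom's helper processes may be named
# differently on your platform.
total_rss = 0
for proc in psutil.process_iter(attrs=["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    mem = proc.info["memory_info"]
    if "atom" in name and mem is not None:
        total_rss += mem.rss
        print(f"{name}: {mem.rss / 2**20:.1f} MiB resident")

print(f"total: {total_rss / 2**20:.1f} MiB across matching processes")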

It's just a text editor. That kind of size says to me that something went wrong in the design process. Have I just missed out too much on modern development realities, or is this just another convoluted/lazy coding project? (No offense to any developers, it's not personal - groupthink has its ways....)

Re: i don't understand (Score: 1)

by kwerle@pipedot.org on 2014-07-11 01:02 (#2ET)

I figure that almost all of that is the Chromium base. Node.js is in there, too - but it can't be all that big.

So I guess I'd chalk it up to "lazy" - but I figure that once the framework covers the stuff you want to show, you probably don't want to go trying to tear out all the stuff you don't need.

As always, if you're worried about memory on your development machine, you're doing something wrong - gigs are practically free. I don't think I'd try to deploy Atom to an embedded system :-)

Re: i don't understand (Score: 5, Insightful)

by genx@pipedot.org on 2014-07-11 09:56 (#2EX)

This kind of reasoning has been flourishing in recent years, and it would be fine if one were only using one program. But it does not scale. I have perhaps 30 programs running at the same time: many terminals, many PDF viewers, several text editors, 1 IDE, 1 web browser, 1 mail agent, 1 music player and many more. Just imagine when each of these 30 pieces of software starts requiring 10 or 20 times the memory and power it should need.
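
To put rough numbers on the "it does not scale" point, here is a back-of-envelope sketch in Python. The 50 MB "reasonable" per-program footprint is an assumed figure for illustration, not something measured in this thread.

# Back-of-envelope version of the argument above. The baseline of 50 MB
# per program is an assumption, not a measurement from the thread.
programs = 30
baseline_mb = 50                  # hypothetical "sane" footprint per program
for bloat in (1, 10, 20):         # 1x = lean, 10x/20x = the bloated case
    total_gb = programs * baseline_mb * bloat / 1024
    print(f"{bloat:>2}x footprint -> ~{total_gb:.1f} GB just to keep everything open")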

And I do not want to pay n x $20 to add memory modules (supposing my motherboard even supports it), and I do not want to buy a new computer (or motherboard + processor + a new kind of RAM), just to get roughly the same functionality I had before (sometimes a bit more, sometimes a bit less, generally little more than a fashionable change of appearance or engine, not an astounding improvement). All right, it works fine on the developer's machine, which he upgraded, but because of his laziness / lack of care, millions of users will have to buy more RAM - millions of dollars spent on this.

On a side note, I do agree that a good computer is a lot of comfort for development work, but there is a huge drawback: developers totally lose contact with reality, the reality of the computers users will (try to) run the program on. If they had average computers, they would be aware and would care a bit more about what they do. I do not mean they should ultra-optimise everything; just avoiding bloat would be a start. You cannot produce good quality software on a bleeding-edge setup; you will not see or feel many of the problems, and a one-hour test on a small computer at release time is not enough to experience them. The laziness and convenience of the developer is quite opposed to the convenience of the end user.

I recently had to use a program by a major chipmaker that was hardly doing more than displaying folding lists with checkboxes. The thing took several hundred MB when launched and was as slow as can be imagined (click on a checkbox -> wait 3 seconds before the box is checked)! And, to add insult to injury, with such "good" programming, it started leaking memory like hell. It was not a memory leak, it was a memory flood: after 2 hours, it would eat an extra GB of RAM. Oh, on a powerful computer it could be run. The programmer likely had such a computer and did not even notice these major problems.

Re: i don't understand (Score: 2, Interesting)

by kwerle@pipedot.org on 2014-07-11 15:52 (#2F4)

Your developers should be mindful of the target systems. If they're not, they're doing it wrong. Your testers should have systems with the target specs. If they don't, they're doing it wrong.

Even if you have all those programs running, most of 'em should be [mostly] swapped out while not being actively used. If they're not, you're running the wrong OS.

Computers are so powerful that I do all my development work on a laptop - and it runs like a dream. Even if I had 100 windows open and they all took 50MB of memory and were all fully active all the time, I'd still have several GIGABYTES to do real work in. I mean - just typing that shit blows my mind, because a decade ago that was hard to imagine, and two decades ago it was unfathomable.
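
As a quick sanity check of that arithmetic, here is the same figure worked through in Python; the 8 GB and 16 GB totals are my assumptions, since the comment does not say how much RAM the laptop actually has.

# 100 windows at 50 MB each; total RAM figures below are assumed.
windows, mb_each = 100, 50
used_gb = windows * mb_each / 1024          # ~4.9 GB in use
for total_gb in (8, 16):
    print(f"{total_gb} GB RAM -> ~{total_gb - used_gb:.1f} GB left for real work")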

When I started in college, emacs was still jokingly said to stand for Eight Megabytes And Constantly Swapping. Because 8 meg could easily cause you to swap - and because emacs was so HUGE!

Those days are gone.

Moderation

Time Reason Points Voter
2014-07-12 19:05 Interesting +1 gallondr00nk@pipedot.org
