Apple shifts from Objective-C to Swift

Apple announced a new programming language yesterday at its yearly developer conference. With improvements in speed and ease of development, the new language, Swift, aims to replace Objective-C, Apple's previous language of choice.

As usual, development in the new language is limited to the company's Xcode IDE, available at no cost on OS X.
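For readers who haven't seen the language yet, here is a minimal sketch of what Swift code looks like, touching on the ease-of-development points from the announcement (type inference, string interpolation, and optionals). The Device type and its fields are invented purely for this example.

```swift
// A small illustration of Swift's terser syntax: type inference,
// string interpolation, and optionals checked at compile time.
// The Device type is made up for this example.
struct Device {
    let name: String
    let year: Int
}

let phone = Device(name: "iPhone", year: 2014)   // type inferred as Device
print("\(phone.name) was introduced in \(phone.year)")

// A value that may be absent (Int?) must be unwrapped before use,
// and the compiler enforces this.
let maybeYear: Int? = Int("2014")
if let year = maybeYear {
    print("Parsed year: \(year)")
}
```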

Re: Tragic NIH Syndrome (Score: 2, Interesting)

by genkernel@pipedot.org on 2014-06-05 15:57 (#20V)

While perhaps this is just NIH syndrome, I think we *need* new programming languages to replace the ones we already have, especially new languages that can replace C.

I largely like the C/Python language stack. But C isn't by any means perfect, and I feel that some things could really benefit from some re-thinking. C is good, but if millions of hours are being spent coding in it, even small improvements are worth a lot. Objective-C seemed to me like a good thing, except for being limited to Apple products, because adding classes to C is both simple and powerful. For this reason I've been telling myself I need to check out D for some time. C has a great theme, and I doubt I'll give it up for some time, but I adamantly believe people can do better.

And let's not forget Ada, a language that as far as I can tell has never seen its equal. It may not always be what you want, but the sheer power of compile-time checking in that language is amazingly useful. Honestly I think a language like this really needs to come back for coding mission-critical software. Oh how many failures could have been prevented if only people used a tool like Ada! Unfortunately, Ada just isn't as useful as other languages at the moment, for sheer lack of library and community support.
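Ada's range subtypes and static contracts go well beyond what most mainstream languages offer, but as a rough illustration of the kind of compile-time enforcement being praised here, the sketch below uses Swift (the article's language) rather than Ada; the ValveState type and the sensor-reading scenario are invented for the example.

```swift
// A rough analogue of compile-time checking: the compiler rejects
// non-exhaustive switches and unhandled nil values before the program runs.
// Ada's checks are richer; this only hints at the idea.
enum ValveState {
    case open, closed, fault
}

func describe(_ state: ValveState) -> String {
    // Omitting the .fault case would be a compile-time error,
    // not a bug discovered in the field.
    switch state {
    case .open:   return "valve open"
    case .closed: return "valve closed"
    case .fault:  return "valve fault"
    }
}

// A possibly missing value has a distinct type (Int?), so using it
// without handling the missing case will not compile.
let reading: Int? = Int("42")
if let value = reading {
    print(describe(.open), "reading:", value)
}
```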

In short, when programming languages are being used so widely and when so much time is being spent using them, it is completely unreasonable to *not* look for improvements: improvements in performance (though at the moment that isn't as much of an issue), in error checking and robustness, and in ease of use for faster development time. Even small improvements in any of these can be so valuable. With that in mind, does it really seem so strange that companies on the cutting edge would look to take advantage of their position to try to improve their own ability to develop programs?