How Microsoft Dropped the Ball With Developers 814

cremou writes "As part of an Ars Technica series on how one developer migrated from Windows to OS X (and why), this second article concentrates on how Microsoft bungled the transition from XP to Vista. The author looks at some unfortunate decisions Microsoft made that have made Windows an unpleasant development platform. 'So Windows is just a disaster to write programs for. It's miserable. It's quite nice if you want to use the same techniques you learned 15 years ago and not bother to change how you do, well, anything, but for anyone else it's all pain... And it's not just third parties who suffer. It causes trouble for Microsoft, too. The code isn't just inconsistent and ugly on the outside; it's that way on the inside, too. There's a lot of software for Windows, a lot of business-critical software, that's not maintained any more. And that software is usually buggy. It passes bad parameters to API calls, uses memory that it has released, assumes that files live in particular hard-coded locations, all sorts of things that it shouldn't do.'"
This discussion has been archived. No new comments can be posted.


  • by erroneus ( 253617 ) on Monday May 05, 2008 @08:26PM (#23306612) Homepage
    The culture of DOS programming was corrupted from the beginning and you can partly blame IBM for a crappy BIOS. Were it not for the crappy BIOS, programmers wouldn't have had to resort to writing directly to hardware to get an acceptable speed on the screen. And it just kept going on from there. And now when a developer wants more "something" from the OS than they can get naturally, they write VxDs to help gain an advantage.

    The culture is all about writing code to get past deficiencies and shortcomings in DOS/Windows.

    Windows programmers don't respect the rules... and if they do, they write what appears to be crappy software.
  • by glassware ( 195317 ) on Monday May 05, 2008 @09:00PM (#23306894) Homepage Journal
    I want to second this concept. Back in 1998, when I started a company of my own, I insisted that my partners and I purchase a $500 MSDN license so we could do current development on Microsoft platforms.

    In 2004, when I joined a company that was well funded by venture capitalists, they required that I cost-justify the $2000 MSDN license cost. I argued that we were developing consumer applications and we needed the license.

    In 2007, I can no longer justify $3500ish for MSDN. It just doesn't work anymore. They offer reduced versions of MSDN, each of which eliminates all the reasons why a person would subscribe to MSDN. They offer only 10 application installs for your $3500. They offer only a few OS installs. After you've installed a few, they stop letting you install more development copies and insist that you call them for more authorization. It just doesn't work anymore, and I'm sad because I really liked being able to develop code without artificial roadblocks in my path.
  • by dreamchaser ( 49529 ) on Monday May 05, 2008 @09:05PM (#23306942) Homepage Journal
    No, I didn't miss the point. Using an undocumented API is another example of bad programming. Yes, even HAVING undocumented APIs is bad as well. Like I said, I was not excusing the mess that is Win32, I was just sayin'...
  • by Opportunist ( 166417 ) on Monday May 05, 2008 @09:44PM (#23307262)
    "When you use undocumented API calls you're in the wrong".

    So far, so right. In theory.

    In practice, though, let's look back at the world of Windows at its beginning. We're in the middle of the '90s, Win95 is fresh out the door and you're supposed to write for it... erh ... well, if you can. There are a few documented API calls which will allow you to write a few cute Windows programs, but they will invariably be unable to compete with programs developed by MS. Why? Because you have no access to the API calls that make your programs faster, or easier to use, or simply allow you to do something at all. Graphics and network code especially were notorious for being impossible to implement sensibly without resorting to functions that were available only when you dug through disassembled DLLs and guessed what was expected from you.

    So programmers faced the choice: Either write programs that cannot compete with programs written by MS (or companies that somehow got a hold of that information), or use calls where a few parameters are described as "set to NULL" or "unknown function".

    MS has a history of releasing information about formats or calling parameters at trickling speed, at best. Anyone who ever wanted to get a hold on, say, the Office container format can vouch for that. It's not really a lot better for API documentation. Usually, you get it for a lot of money, if you are deemed "worthy" first of all.

    Programmers don't let a company do that to them, though. They start figuring things out, reverse-engineering libraries and even existing program code to get the information they need. Of course, this results in the occasional mistake.

    Jump to the present. The companies that created software back then don't exist anymore, blown up in the dot-com bubble. Their software, though, still exists. And companies now rely on this software. So MS has to maintain those "buggy" APIs, or else companies running buggy software will refuse to upgrade.

    Who is to blame? Basically, whoever decided that it's a smart idea to withhold the API documentation.
  • by techno-vampire ( 666512 ) on Monday May 05, 2008 @09:53PM (#23307348) Homepage
    I have no idea what he's on about with the hard-pathed file references.


    This goes back even before Windows. I used to have some DOS programs that would only let you save a file to a floppy. Not just games: a poster-creating program had A:> built into the path for saving files, and there was no way to change it. Granted, even the most rabid Microsoft-basher can't blame that on them, but it's part of the way programs used to be written. It's the same type of mindset that caused game designers in the early DOS days to hardcode timing loops because, of course, the PC would always run at 4.77 MHz.

  • by TheNetAvenger ( 624455 ) on Monday May 05, 2008 @10:26PM (#23307614)
    Define "throughout the OS". In Leopard, almost all the libraries come in 32-bit and 64-bit forms; it's just the kernel and a lot of executables that come with the OS that are 32-bit only. 64-bit userland code can talk to 32-bit kernel code just fine.


    Leopard's idea of 64bit is providing a 64bit version of Cocoa for application development. Like Tiger, there are a few OS-level 64bit pieces for addressing more RAM than 32bit allows, but the majority of the OS kernel itself is still 32bit. (Carbon was supposed to be moved to 64bit as well, but Apple gave up on it.)

    So this is why you only get 32bit drivers with OS X, because that is what the OS is. Apple tries to use this as a marketing tool, from their own web site:

    "Leopard is the first universal operating system to seamlessly support both 64-bit and 32-bit applications. There's no need to upgrade drivers, and you can even stick with your existing printers, storage devices, and PCI Express cards." http://www.apple.com/macpro/technology/leopard.html [apple.com]

    However, because there are areas where 64 bits at the OS level DOES help that Apple is not using, it's a hybrid at best: a 32bit OS with 64bit application support and a few kernel-level *nix 64bit flags to be more generic.

    For example, one area where Apple would benefit from full 64 bits would be video drivers, as shoving data to GPUs in 64bit chunks is much more efficient and faster. (There are other areas that can be seen if you contrast Vista x64 vs Vista x32: Vista x64 with 2GB of RAM runs even 32bit applications 5-15% faster, especially gaming applications, because of the 64bit driver support.)

    Unlike what a lot of fanbois would like people to believe, 64bit is not slower, even if it is allocating or shoving twice the bits around; the other benefits of today's 64bit CPUs more than make up for it, just like the process scheduler in the 386 was worth the 16-to-32bit jump, in addition to extra registers, less internal tracking and paging, etc.

  • Re:Inconsistencies (Score:2, Informative)

    by icydog ( 923695 ) on Monday May 05, 2008 @10:36PM (#23307680) Homepage
    The author is not saying the third type of developer is unimportant; rather, he is saying the opposite. He downplays their importance in a tongue-in-cheek manner, mocking Microsoft. If the author were to categorize himself as one of the three types of developers, he would probably associate with the third type.
  • by Anonymous Coward on Monday May 05, 2008 @10:44PM (#23307746)
    You have a lot of things wrong about .NET.

    C++ is not a subset. You can take any C++ project that exists, flip a compiler flag, and it will be compiled for .NET. There are no limitations. The runtime permits the seamless integration of IL and native code. What can be compiled to IL will be compiled to IL and everything else will be native. Microsoft proved this back in 2003 when they took the public source for the Quake II engine and recompiled it in Managed C++, then added a .NET plugin model and a C# radar.

    The CLR is fully capable of working with dynamic languages. There are quite a few that are already out and work fine. The DLR doesn't add anything new to the underlying runtime to better support dynamic languages. What the DLR strives to provide is a consistent pluggable API for hosting dynamic languages. Currently, if you wanted to host IronPython, IronRuby, Boo, PowerShell, JScript, etc., you would have to use a language specific host and bindings. The DLR will make it more predictable, like how CodeDom currently allows you to work with the various static languages by using the same object model.

    The power is in the libraries, but the libraries are consumable by any CLI-compatible language, or any native language hosting the runtime.

    The .NET runtime, from day one, was designed to be a very generic system. The one mistake was that generics were not in v1.0, and that led to a split between some portions of the runtime. That was the only time the underlying VM had to change. Every feature since has been nothing more than a library, or compiler candy. An example is that .NET 1.0 CIL had a tailcall opcode; only functional languages that use recursion for looping would require such an opcode.
  • by TheNetAvenger ( 624455 ) on Monday May 05, 2008 @10:45PM (#23307750)
    OK, how many Microsoft applications use WPF/WCF?

    Several... Should we start with things like Silverlight, or go into the whole WDDM model of Vista that uses XAML from the compositor to printing? You are trying to confuse .NET 3.0 applications here, and you are off track.

    Mac OS X's low level graphics APIs are called Quartz and OpenGL. Quartz is effectively Display PDF. Display Postscript sadly died in Apple's hands.

    Ok, OpenGL is not Apple's any more than it is Microsoft's.

    Display PDF is equivalent to GDI+ from Windows 2000, go look it up. Additionally, Apple's implementation of Display PDF even lacks the full specification, which is sadly dated anyway.

    In terms of Alpha, transparency, layering, etc, Display PDF ends up rendering to a bitmap on complex drawings instead of being able to natively draw them using the language of Display PDF.

    Go to Channel10 and watch the video on WHY XAML/XPS was developed, and how it was created specifically to overcome the limitations and conceptual drawing limitations of Display PDF and full PDF. (This is why printing-press companies are starting to use XPS: it can reproduce images at higher quality without having to fully rasterize the image as many do now, i.e. a lot of PDF printing is rasterized and is nothing but a container for the bitmap, because PDF cannot do the advanced drawing.)

    Take a look at Core Image, Core Animation, and Quartz Composer, and even venerable Quicktime to see where MS got the ideas they imitated badly. Then look into Interface Builder nib files which have provided more than XAML capabilities (on the desktop) since October 1988. It took MS 20 years to copy that one.


    First, Apple's 'Core' crap is nothing like the new API features or architectural changes in Vista. Core is more about using SSE on the CPU and offers very few new features.

    You also need to get out of 'fanboi' mode and check your timelines here a bit. You are trying to compare XML document structures with XAML, which uses an XML structure. However, how XAML is stored is irrelevant; what matters is how it processes graphics, allows for advanced drawing capabilities, internal binding, animation properties, and 3D drawing, in addition to providing a new UI paradigm of control contexts. (Again, you really don't know what you are talking about here.)

    When you can take a text editor, write 10 lines of XML, and create a 3D scene with a movie and UI controls on OS X, get back to me. Until then, Microsoft has set the stage for the next generation of development, especially when it comes to graphic designers becoming part of the design process.

    Did you notice that the Windows 95 GUI was a rip-off of the NeXTstep GUI, but copied poorly? You have your history wrong.


    Actually it wasn't, although many GUIs from this time frame shared a lot of ideas. Win95 was a small subset of Microsoft's development of the OS/2 Object based UI that goes back to 1987/1988.

    But when I was talking about IBM and Microsoft writing UI guidelines, I was specifically talking about UI specifications and standardized UI guidelines that defined an era of UI for OSes. Go back and look at the UI guideline papers for OS/2 written by IBM and Microsoft in the '80s, and how even Windows stuck to a version of the Common UI guidelines. Additionally, as Apple struggled to implement keyboard support, they took from these Common UI guidelines as well.

    You are either too young to know this stuff or just too much of a Mac OMG person to even consider Apple didn't invent everything.

    I'll ignore the rest of the nonsense. Suffice it to say that OS X runs 32 bit and 64 bit applications side by side. How does that work in 4 bit Windows?

    Perfectly. In fact, I'm running Vista x64 on this 2005 laptop, and all my 32bit applications run fine, especially my games, which get a punch up in performance because Vista has real x64 support and REAL 64bit video drivers, unlike OS X.

    Do you really think Vista x64 can't run 32bit applications? Are you that out of touch? Holy crap...
  • by Guy Harris ( 3803 ) <guy@alum.mit.edu> on Monday May 05, 2008 @10:55PM (#23307824)

    Leopard's idea of 64bit is providing a 64bit version of Cocoa for application development.

    Leopard's idea of 64-bit is providing 64-bit versions of most frameworks and libraries for application development. The one big exception is Carbon, and, yes, that's a big exception, but it's not as if there's not much you can do in 64-bit mode (as was the case in Tiger).

    Like Tiger there are a few OS level 64bit pieces for addressing more RAM than 32bit, but the majority of the OS kernel itself is 32bit still.

    Yes, as I said in the posting to which you're replying. I also said, however, "64-bit userland code can talk to 32-bit kernel code just fine." There might be performance boosts from running in 64-bit mode in the kernel, but it's not as if you'd gain much in the way of application functionality, as opposed to application performance, from having a 64-bit kernel.

  • by Silver Gryphon ( 928672 ) on Monday May 05, 2008 @11:05PM (#23307902)
    I agree, $3500 is insane for MSDN (I cap its value at $2,000), and I think the Premium Uber MSDN with Team System costs like $11,000. And the Express editions of Studio just don't cut it for a lot of people; no source control or unit testing, etc. Still, there's a middle ground:

    Microsoft Certified Partners are entitled to a certain number of MSDN subscriptions and/or Visual Studio copies, depending on their partner level. Even as just a Registered Partner (anyone can get this simply by signing up for free), there's something called an Action Pack, I think, that includes enough licenses to get a small business running - server OS, SQL Server, etc. The Action Pack costs either $200 or $400, and I'm too lazy to verify that but here's the link if you're still interested:

    http://partner.microsoft.com/ [microsoft.com]

    Getting Certified Partner status isn't a big hurdle; get some customer references and prove certain technologies are within your scope and you're well on your way.

  • by Gazzonyx ( 982402 ) <scott,lovenberg&gmail,com> on Monday May 05, 2008 @11:07PM (#23307912)
    Just wondering... did your program interface with a database at all? You should see the regressions with DAO/ADO/ODBC/JET/etc. Now all you've got is ADO.NET, and from what I've seen the calls aren't the same as the calls for the pre-.NET APIs.
  • by TheNetAvenger ( 624455 ) on Monday May 05, 2008 @11:35PM (#23308128)
    What is .NET? Every time I've looked it up I've gotten a different answer

    I completely agree that Microsoft's naming of .NET was insane and poor marketing. .NET is several things, from a managed framework to extended technology ideas, to even API sets.

    Generally it is an application framework that is rich, managed (apps are easier to write and won't hog memory, as it cleans up after itself), and language-agnostic. So whether you write in C, C#, Pascal, VB, Python, even ASP or PHP, you can write .NET-based applications that can be standalone programs, web applications, HTML pre-processors, games, virtually anything; even Silverlight is a variant of .NET technology. It is also highly extensible, as the Vista APIs were added to .NET seamlessly without even changing the core components of the .NET framework. This gave .NET the ability to do XAML, 3D, etc. overnight.

    It is kind of complex, so this is a very generic description; I hope it helps a bit. .NET has some advanced stuff no one else is doing, and there are also some things that suck, like all platforms/frameworks.

  • by rs79 ( 71822 ) <hostmaster@open-rsc.org> on Monday May 05, 2008 @11:49PM (#23308212) Homepage
    I worked at a computer manufacturer in the late '70s and early '80s in LA; I started on 8-bit micros and was there for the introduction of 16-bit chips.

    I was used to PDP-11s, keep in mind.

    The problem wasn't that the 68000 wasn't ready; it was ready. There were just no support chips yet. Intel actually delivered a complete solution: CTC chips, PICs, serial ports, DMA controllers, I/O processors (that nobody but us used).

    Motorola had a CPU and that's it. A vastly *superior* CPU, but the hardware guys wanted to build systems not wait for the rest of the stuff they needed. So we all held our noses and went x86. And bought Amigas as soon as they were out (I have serial #11. Still.)

    This crap is all in one chip these days, but back then computers had several large black chips inside them.
  • by Anonymous Coward on Tuesday May 06, 2008 @12:00AM (#23308274)
    > Yes, even HAVING undocumented API's is bad as well. Like I said, I was not excusing the mess that is Win32, I was just sayin'...

    NO! He did NOT say they use undocumented APIs! He said that the *official* APIs have undocumented gotchas and quirks. That's NOT the same thing at all!

    For example, I seem to recall that Vista has some undocumented extras for any application named setup.exe. Now imagine that, but in the API, perhaps if CreateProcess() was doing weird stuff to your application's memory just because you passed certain, specific values that happened to be the same as a crazy legacy application. That's exactly the sort of stuff that can cause problems at the worst possible time.
  • by LordMyren ( 15499 ) on Tuesday May 06, 2008 @12:24AM (#23308412) Homepage
    Sure dude whatever.

    First off, Microsoft never claimed .NET was portable, just the CLR. And, for the record, the first reference implementation of the CLR -- Rotor -- was cross-platform to BSD. Mono came along of its own volition and works independently of MS, and MS has never made any claims on its behalf. Mono has had very good 2.0 compliance for going on three years now. Library support is excellent but not perfect, which makes sense given that the .NET library is a massive, all-encompassing, kitchen-sink beast and there will always be pieces no one has ported. That won't prevent 99.99% of apps from working.

    Rather than FUD'ing around, I suggest downloading the Moma tool, which will check whether an application is compatible with Mono.

    Mono supported the DLR by day five after release or something. C# 3.0 support is on the way, and, to my understanding, most of the pending work is still in LINQ re-implementation land. Given that it was released November 19, 2007 and that it requires implementing a huge AST, I wouldn't complain.

    At its base, the CLR is a wonderful VM to implement and write to. The libraries built on the CLR are hit or miss. For the most part, I prefer the non-MS ones anyway.
  • Qt (& GNOME) (Score:5, Informative)

    by scorp1us ( 235526 ) on Tuesday May 06, 2008 @01:32AM (#23308784) Journal
    As someone who had to learn C++/CLI and writes code to allow legacy code to interop with C# at work, I have this to say.

    If you are going to learn a new platform for a "modern" app or OS, then let it be one that allows you to target more than one platform. Seriously. Let's take a look at .NET:
    - Everything in the library is new.
    - You can only officially target one platform. (Mono notwithstanding)
    - You have to learn a new language to use it effectively.

    Now look at Qt:
    - New library
    - Builds with the same C++ compiler you've always used
    - No messy COM; no COM wrappers needed for introspection
    - You can target any platform with a modern C++ compiler (VS6 and higher on win32, gcc on all platforms)
    - Ground up C++, clean consistent API.
    - Active development with binary compatibility within major releases.
    - Python, ECMA scripting, (some C# support too!)
    - Java version
    - Meta-object compiler adds introspection. (no need to deal with COM)
    - ActiveX interop in the commercial version (You can use Qt widgets in Winforms and vice-versa)

    I don't know as much about GNOME, but it shares a lot with Qt, so should not be excluded.

    About the only thing you miss out on is the automatic garbage collector. Qt emulates this to some degree by allowing every QObject to have a parent. Then the only thing missing is the ability to defragment memory in the heap. I've only heard about this being caused by lots of small memory allocations, but Qt block allocates so this isn't a problem. Also, many types are implicitly shared, meaning they are more like handles to the objects, meaning that 1) they can cross thread boundaries 2) they are references until they are modified.

    All in all I see you only lose out on the memory defrag. But you don't need to learn C++/CLI or C#. (My opinion of C# is that if you're going to go that far, you might as well take the goals of the language to completion, in which case you end up with Python. Oh yeah, there is a Python wrapper for Qt too.)

  • Re:Long Answer? (Score:3, Informative)

    by CodeBuster ( 516420 ) on Tuesday May 06, 2008 @02:07AM (#23308952)
    I will grant you that ASP.NET has been a sore point, but they are finally starting to see the light with the ASP.NET 3.5 Extensions and the Model View Controller [asp.net] option for ASP.NET development. Scott Gu has an entire blog article series (complete with example project) covering the ASP.NET MVC Framework, and it really is going in the right direction. I know that the MVC idea is not original and it certainly wasn't invented at Microsoft, but imitation is the sincerest form of flattery, as they say, and Microsoft is finally seeing the light on this one.

    For as much as I like programming in C#, I just can't see myself using it very often, as it is just not very well suited to the kinds of work that I do most, particularly web development, where I use mostly JavaScript and whatever server side language happens to be convenient
    The risk there is that you are reinventing the wheel with JavaScript DOM code and other plumbing that has been done thousands of times before by countless other developers and, let's face it, probably done better. I wouldn't trust myself to write the best general-purpose JavaScript libraries either, and I wouldn't try, especially when I know that someone else has already done that. A really great programming environment frees one from having to be concerned with plumbing issues that, while interesting on some level, are not generally directly relevant to the problems at hand (i.e. business logic and what the site should actually be doing).

    and low level server architecture, which I mostly do in C and occasionally C++

    Reminds me of the CGI days, which were really quite primitive compared with the web applications that can be built today with much better tools. Perhaps you are one of the elusive Jedi masters of C and C++, but I think it is fair to say that C or C++ is really an inappropriate choice for the majority of web developers these days.

    As for the absolute performance of ASP.NET vs other possible approaches I am not surprised that ASP.NET doesn't win the speed race, but really, how many of us run sites like Yahoo where that last ounce of performance and speed is really worth the price in terms of low level complexity and micro optimization required to achieve it? ASP.NET is an appropriate and reasonable choice for the vast majority of web application projects that most developers will be asked to build.

  • by Brandybuck ( 704397 ) on Tuesday May 06, 2008 @02:31AM (#23309026) Homepage Journal
    Most of the flashy Microsoft tools are there to correct the horrible APIs they have. For example, it's nearly impossible to write MFC code without the Class Wizard. But with Qt all you need is a text editor and compiler. Your *choice* of text editor and compiler. Even Visual Studio.
  • Re:Long Answer? (Score:2, Informative)

    by radio4fan ( 304271 ) on Tuesday May 06, 2008 @03:16AM (#23309244)

    Couldn't they even keep backwards compatibility via virtualization?
    Yes, that seems like the way to make real progress to me.

    And it's exactly how Apple supported OS9 when OSX came out.

    It worked very well for the vast majority of programs.
  • Re:Long Answer? (Score:5, Informative)

    by arkhan_jg ( 618674 ) on Tuesday May 06, 2008 @03:24AM (#23309280)
    This is theoretically the plan with Windows 7: a new, clean, minimal and modular OS based on the server line, without binary compatibility for old apps, with a new API for the new OS. Instead, there will be a separate backwards-compatible API, a set of monolithic libraries providing all the old functions, on the same principle as Classic on OS X. Old apps will run as before, but through a compatibility layer to the new OS, while apps can be recompiled to talk directly to the new API and presumably take advantage of it.

    IE's rendering engine can go in the legacy libraries for old apps, for example, while being a modular component that's fully removable in the new OS (thus keeping the EU competition commissioner happy).

    That's the theory anyway. Whether MS manage to pull it off is another question.

  • Re:Long Answer? (Score:5, Informative)

    by vipw ( 228 ) on Tuesday May 06, 2008 @04:23AM (#23309580)
    Apple IS abandoning Carbon. There will be no 64-bit version of it http://arstechnica.com/staff/fatbits.ars/2008/04/02/rhapsody-and-blues [arstechnica.com], so no one is going to be using it before long. Compare this to the current state of .NET, where developers have to constantly mix in win32 calls to do anything but the most basic applications. My own personal experience with .NET is only a few months, but I have had to use Win32 API a lot.

    And NeXTStep is a magical, shiny, new API compared to Win32, which is the biggest mess I've ever seen. Admittedly, I'm used to simpler systems like UNIX.
  • Re:Long Answer? (Score:5, Informative)

    by jeremyp ( 130771 ) on Tuesday May 06, 2008 @08:04AM (#23310470) Homepage Journal

    So let's hear it ... is xcode a real competitor to visual studio ?

    No it's not. I've been using Xcode for several years and I still haven't figured out how to build a Windows application with it. Conversely, Visual Studio cannot be used to build a Macintosh application either. The two IDEs are used for different tasks so they can never be competitors.

    Is Xcode as good for developing Mac applications as Visual Studio is for Windows applications? I think probably not. It has some superior features (NB: the most recent version of VS I have used is 2005, so my knowledge may be out of date). It has some features I like better than VS, but that's probably because I prefer the Mac environment. It has some features that are worse than VS. The debugger is the worst of these, IMHO.

    On balance Visual Studio is a far better developer environment, but that counts for nothing if you are trying to write a Mac GUI application, in which case Xcode is the only game in town as far as I know.

  • Re:Long Answer? (Score:3, Informative)

    by vipw ( 228 ) on Tuesday May 06, 2008 @11:26AM (#23312552)
    An example is making the sort arrow show up in a listview control. http://www.thebitguru.com/articles/16-How+to+Set+ListView+Column+Header+Sort+Icons+in+C%23 [thebitguru.com]

    But I've had far more fun with PostMessage and friends.
  • Re:Long Answer? (Score:4, Informative)

    by VoidEngineer ( 633446 ) on Tuesday May 06, 2008 @12:31PM (#23313402)
    I'm not going to go through your entire to-do list, although I will answer a couple of your questions.

    Except for to confuse me, why do ICollections have a Count, when arrays have a Length? What's the meaningful semantic distinction that I'm missing here?

    ICollections implement the IEnumerable interface and have an Enumerator object which counts the objects in a collection. Think of an enumerator as an odometer (like the one in your car). If the object implements ICollection, then it has an odometer, which you can get the count from. Arrays just have size.

    Why do arrays have LongLength but with no corresponding LongCount for ICollections (3.5 adds LongCount as an extension method, but that gives it inconsistent (method) syntax, and of course it can never work because even if the collection were extended to support long lengths, an extension method can never exploit that fact, because the extension just works on IEnumerable which supports only an int Count)?
    Again, it's the difference between an attribute and a verb. LongLength is an attribute of an object, such as height or width, whereas Count is like the odometer in your car. Your question is kind of like asking why you can travel tens of thousands of miles along the US highway system when the trip odometer in your car only goes up to 1000. Well, that's just the way the odometer was made. The length of a trip is a distance, whereas your odometer is a counter and is only a *measure* of distance. It's simply a distinction the language makes.

    Why do arrays have LongLength, and not simply have Length be a long? It surely didn't take a whole lot of foresight to figure that one out, did it?
    Chunking. .NET gets used on both 32-bit and 64-bit platforms, and the performance penalty for splitting a 64-bit word into two is greater than that of using two 32-bit words. On a 32-bit platform you'd still have to use 64-bit words, padding the first 32 bits with zeros and converting to 32-bit words, which requires an extra pass through the processor, whereas combining two 32-bit words into a 64-bit word is trivial. I'm not explaining this concept well, but if you look it up you'll find more info on the question you're asking. The design decision was based on the market saturation of 32-bit processors at the time, and LongLength was added as a conversion for the 64-bit programmers.

    Where do I find reverse iteration/enumeration? Where do I find bidirectional iteration/enumeration?
    Using the odometer analogy as above, the enumerator only goes forward, although you *can* reset it. If you want to do reverse iteration, copy your collection into a new collection backwards, and iterate over that new object. Alternatively, for most reverse or bidirectional iterations, you'll simply want to ditch the 'foreach' loop and use a simple 'for' loop. Then you can start high and decrement to count down. I also like to use decrementor collections which get an object removed with each pass of the for loop.

    Why is there no generalized mechanism for storing my position in a container?
    You need to get the IEnumerator object from the ICollection, using the GetEnumerator method. It will have a Position field.

    Why is there no SortedSet?
    Probably implemented somewhere else.

    Why are there no static type-inferencing factories for read-only collections or singleton collections? When you have generics, factory methods are good, because factory methods can infer. Don't make me type IList myReadOnlyList = new ReadOnlyCollection(myList); the double specification of the type is spurious.
    You're splitting hairs here. It's a strongly typed language; it's meant to be explicit, not inferential. In other projects, besides yours, double-specifying types is needed and a useful (if not critical) feature.

    Why is it named ReadOnlyCollection when it is in fact a IList? Why is there no true ReadOnlyCollection (i.e. that is actually useful for collections)? Why is it na
