Windows Switch To Git Almost Complete: 8,500 Commits and 1,760 Builds Each Day (arstechnica.com)
An anonymous reader quotes a report from Ars Technica: Back in February, Microsoft made the surprising announcement that the Windows development team was going to move to using the open source Git version control system for Windows development. A little over three months after that first revelation, about 90 percent of the Windows engineering team has made the switch. The Windows repository now has about 4,400 active branches, with 8,500 code pushes and 6,600 code reviews each day. An astonishing 1,760 different Windows builds are made every single day -- more than even the most excitable Windows Insider can handle.
Linus Wins Again (Score:5, Insightful)
Say what you will about Mr. Torvalds, but that magnificent bastard has smacked down many a foe over the years. This is really sweet. If the only thing Linus ever did was to invent git, then that would have been enough. But no, he had to write an operating system besides. When history is written, Linus's inspiration will shine forth from the Pantheon of greats.
Re:Linus Wins Again (Score:5, Funny)
When history is written, Linus's inspiration will shine forth from the Pantheon of greats.
And those historians should take caution, lest they call his operating system "Linux" and are forevermore haunted by Stallman's ghost.
Re:Linus Wins Again (Score:5, Funny)
Stallman's not dead, he just smells that way.
Re: (Score:3)
So Linus essentially helped streamline Windows development. Yeah, what a win.
Re: Linus Wins Again (Score:5, Insightful)
Software is not a zero sum game. Windows' wins are not Linux's losses.
Re: Linus Wins Again (Score:3, Funny)
He helped create the perpetual year of the Windows desktop.
Re:Linus Wins Again (Score:5, Interesting)
Re:Linus Wins Again (Score:4, Interesting)
Interesting read, actually. They call it GVFS, but it is really just a more asynchronous mode for local git repositories. Traditional git downloads the whole repository as a local copy. That's a feature by design, because it allows a developer to work completely offline and only do a git pull when ready to merge branches. It sounds like the Windows repository is a) large and monolithic, so a given developer team does not work on the entire codebase, and b) frequently synced to the central repository (i.e., it is not really decentralized), so the traditional git model has shortcomings for them. One can argue about the structure of the Windows repository, but GVFS sounds like a nice feature to have regardless. The only question I have is...
The third thing the company has done is build a Git proxy server
Why didn't they just clone the Azure repository to somewhere on the East Coast? Git is designed to handle this type of replication, so why did they write a "proxy server" (not entirely sure what they mean by that).
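For reference, the kind of replication the parent means is indeed built into stock git. A self-contained sketch, with throwaway local repos standing in for the real central (e.g. Azure-hosted) one:

```shell
# Self-contained sketch of vanilla git replication via a mirror clone,
# the kind of thing the parent says git is designed for. The "central"
# repo here is a throwaway stand-in for the real central repository.
cd "$(mktemp -d)"
git init -q central
git -C central -c user.email=a@b -c user.name=a \
    commit -q --allow-empty -m "seed"
git clone -q --mirror central central-mirror.git   # regional replica
git -C central-mirror.git remote update --prune    # periodic re-sync
git -C central-mirror.git log --oneline            # full history is there
```

A read-only mirror like this serves clones and fetches fine; the open question the parent raises is why Microsoft needed a bespoke "proxy server" on top, presumably for the sheer size of the repo and for coordinating writes.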
Re: (Score:2)
Why didn't they just clone the Azure repository to somewhere on the East Coast? Git is designed to handle this type of replication, so why did they write a "proxy server" (not entirely sure what they mean by that).
Maybe to avoid the merge conflict resolution required when you are decentralized. With a proxy model you can at least guarantee some form of locking. As you pointed out, they do not truly embrace the full decentralized nature of git, since in a corporate environment it is not appropriate due to the need to lock read and write access to certain sensitive files, if only for government compliance reasons. Yes, you could split the repository into multiple small chunks that are independent, but when the system in the end n
Re: (Score:3)
Re: (Score:3, Informative)
>embrace-extend-extinguish
it's MIT-licensed; you are free to make your own fork: https://github.com/Microsoft/GVFS
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But the Windows tree includes cool art media like Dancing Pigs screen savers, right?
Re: (Score:2)
You do realize that Linux is just the kernel, right? Windows includes a kernel plus a whole lot more, like a GUI and userspace. Sheesh, people, think about it. The closest there is in the free software world would be the BSDs, which include not only the kernel but the userspace....minus the GUI. So now imagine taking the Linux kernel, all the binutils, X and a single desktop (GNOME or KDE) and placing all of that into a single repo.
Re: (Score:2)
Not more complex, just mismanaged.
Take a typical Linux distro. Hundreds of packages, each with its own repository. Many of them are git, but not all. Only one of those is the kernel, which is what people keep trying to compare.
In MS, they have a single repo for everything. This is crazy. Even in my org, which has way less complexity, we have several git repositories because we know, for example, that work on one portion is not related to the other. Sure, there are cross-functional people that might t
Re: (Score:2)
So Windows is so much more complex than Linux, that it cannot be handled by vanilla Git like the other OS? I was thinking the opposite, unlike Windows Linux includes code for almost all the existing hardware platforms out there, and all the hardware drivers already in the kernel, just to name something.
Total misunderstanding what Microsoft is doing. They have a repository of 300 GB. You know what happens if you have a 300 GB repository and type in "git clone"? A 300 GB download starts. It doesn't matter what operating system, a 300 GB download takes time.
While git can handle this all without problems, it takes time. What Microsoft has done is added a virtual file system on top. When you clone the repository with that virtual file system, all that gets copied is the directory structure and the hashes (a
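For comparison, stock git's own partial measure here is a shallow clone; a self-contained sketch with a toy repo (GVFS goes much further, faulting in objects on demand):

```shell
# Self-contained sketch: a shallow clone, stock git's built-in way to
# avoid downloading all history up front (GVFS virtualizes far more).
cd "$(mktemp -d)"
git init -q big
git -C big -c user.email=a@b -c user.name=a commit -q --allow-empty -m "one"
git -C big -c user.email=a@b -c user.name=a commit -q --allow-empty -m "two"
git clone -q --depth 1 "file://$PWD/big" shallow   # only the tip commit
git -C shallow rev-list --count HEAD               # prints 1, not 2
```

This trims history but still downloads every file at the checked-out revision, which is exactly the part GVFS's virtual file system avoids.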
Re: (Score:2)
There's nothing EEE about that, unless you call changing some code that didn't suit your needs EEE. In which case you could criticise every fork ever for the same policy.
Re: (Score:3)
If you read TFA you'd have noticed that Microsoft isn't using Vanilla Git.
Of course they haven't, Git can't handle repos that large particularly well (in fact very few DVCSs would be able to).
Using their normal embrace-extend-extinguish mindset
Extinguish what? The version control system they themselves have just moved to? That wouldn't really make much sense now would it?
they've created their own GVFS (Git Virtual File System) and forks of Git server and client that only work in a GVFS-enabled ecosystem
Where did you get the impression that their forks only work with GVFS? Understandably the client and server would need to be GVFS-aware but that doesn't preclude them from being used without that virtual file system and in fact it would make much more sense for th
Re: Linus Wins Again (Score:2)
Re: (Score:2)
Re: Linus Wins Again (Score:2)
Re: (Score:2)
All Linux filesystems are abstracted through a VFS in the kernel already.
The same on Windows with the IFS, that isn't what GVFS does though.
Re: (Score:2)
Re: (Score:2)
inaccurate (Score:5, Informative)
You obviously didn't RTFA. They had to create this GVFS thing because their code base is huge and they don't want to sync hundreds of gigs between remote locations. Also, they were not using VSS before switching to Git; they were using Perforce.
It's not a WTF. It's a great achievement and will probably become a standard component of large-scale git repos. If you ever had to deal with huge repos that are used by teams in many timezones you'd understand that.
For reference, the Linux kernel git repo is about 6GB all in. The Windows git repo is 300GB. We can all guess that in that 300GB there's a fair amount of dead wood but still, in an era where storage is dirt cheap, one shouldn't have to trim down a source code repo because the VCS can't keep up.
Re: (Score:2)
in an era where storage is dirt cheap, one shouldn't have to trim down a source code repo because the VCS can't keep up.
Cold storage is dirt cheap, not active/hot storage.
If your codebase is somehow 300GB of code..... Imagine what kind of attack surface that represents. This kind of size is about insane.....
Re: (Score:2)
If your codebase is somehow 300GB of code..... Imagine what kind of attack surface that represents. This kind of size is about insane.....
Heh. Google's source repository [acm.org] was 86 PB (yes, that's 86,000,000 GB), in 2015. It's bigger now. Google uses Perforce, BTW, the thing that MS migrated away from because it ostensibly couldn't handle their puny 300 GB repo.
It should be noted that Google's source repository contains more than just source code. In particular, it contains all of the build tools, compilers, libraries, etc. so when you check out a given revision of some project and build it, you build it with the tools used to build it original
Re: (Score:2)
Google *used* Perforce. Past tense. Your own link details the custom system they built to replace it.
Re: (Score:2)
Google *used* Perforce. Past tense. Your own link details the custom system they built to replace it.
That's true now, and in 2015, but the commercial product was used until 2013 or so and when this paper [perforce.com] was published in 2011, the Perforce metadata had exceeded 1TB. The paper doesn't give the total data size, but it certainly had to be in the region of tens of petabytes even then.
The reason I didn't mention Piper (the in-house Perforce replacement) was because it seemed like an unnecessary complication. Google managed a Perforce repository approximately four orders of magnitude larger than Microsoft's before
Re: (Score:2)
Go fix an Android security hole swillden. There are plenty. You should be busier than you are.
I'm quite productive, thank you.
Re: (Score:2)
Cold storage is dirt cheap, not active/hot storage.
The cost for high-end enterprise storage is around $2,500 per TB according to Gartner. For commodity hardware it's around $30 per TB.
This means that if Microsoft were to shrink their git repo by 1/3, they would save about $250 for the gold copy stored on their SAN, and about $3 per developer cloning it.
I don't know where you live but in my book, that's dirt cheap.
Re: (Score:2)
People are forgetting the most important part. Microsoft is actively trying to get some of these capabilities back into mainstream Git. They're actually, gasp!, giving back!
Re: (Score:2)
Of course, if your repo is 300GB, you've done something wrong. That's not to say that there shouldn't be 300GB of resources going into a product, but having a *single* git repo encompass your entire product may be a bad call. It's not just whether the git software can keep up; I can guarantee you people struggle to keep up with the activity going on in the repo, when they really only care about more well defined subsets.
It is a WTF in managing a product.
Re:inaccurate (Score:4, Interesting)
I won't spend time searching for you, but there are blog posts from softies describing the various attempts over the years to segregate code, and this is the result of that failure.
The evolution of windows from a DOS based illusion to a full client server model on a single computer resulted in a lot of bad decisions. Yes I'm aware of os2 and the parentage of XP via NT, but certain allowances were made so that Windows 98 and ME software would work on XP.
There was no management and grand plan. Reading Charles Petzold and Raymond Chen makes that clear. The effort continues. But this is a normal software company trying to ship product, not meet the ivory tower ideal. Hardly a defense, but the info is all out there, mostly documented in this switch to git. Mark Russinovich, SysInternals, Alex Ionescu have the unofficial story, in a sense. How we got here is clear if you know the history.
Re: (Score:2)
Of course, if your repo is 300GB, you've done something wrong.
Maybe what you've done wrong is being one of the most successful software companies for the last forty years, and writing massive amounts of software. Well, maybe the odd video file has been checked in at some point. That's actually a very good point for this: if some idiot checked a 2 Gigabyte video into your git repository, everyone doing a "git clone" has to download that video! And it can be really hard to remove it. With this file system, you wouldn't care (much).
Re: (Score:2)
That success can *easily* be in spite of poor code management practices, or even if the code quality isn't that great.
If they are having to spend a lot of time trying to change git to their use case, that means they are at *least* spending money on that. Given the sort of scenario that would drive these changes, I imagine the code workflow at microsoft is probably hideous. You can still get work done under those conditions, but probably in a really inefficient way.
Re: (Score:2)
It is a WTF in managing a product.
You don't like it, I get that, but that doesn't mean it's wrong. Every organization has different challenges and limitations, and whatever solution they find to solve their problems and sell their product is an achievement. This is not academia, this is real life with real dollars and real computers.
Re: (Score:2)
So the point can be made that if it's something like a bunch of video files, it's weird, but it could be a fair complaint about the VCS. Of course the VCS cannot meaningfully provide fine-grained difference data about non-textual content, so having the process link to some more media-appropriate storage would make sense.
The other half is the implication that of the presumably massively far and wide code is in a single repository. A repository makes sense if people who access are *roughly* equally likely to acc
Re: (Score:2)
Re: (Score:2)
It's all in the article. Their previous system was split in 65 repos, but they wanted to have all Windows in the same repo to be in line with their new engineering strategy.
They had 3 problems:
1 - performance for cloning, which they fixed with that GVFS thing.
2 - git processing files that were not modified (they changed the way it works to improve that)
3 - slower and slower performance when many files are touched but not modified on the desktop (they also changed the way it works to improve that)
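Problems 2 and 3 are about git stat'ing files that never changed; stock git also has knobs aimed at the same problem. A sketch of one such setting (an analogue, not Microsoft's actual patch):

```shell
# Sketch: a stock-git setting aimed at the "huge tree, few changes"
# problem described above (an analogue, not Microsoft's patch).
cd "$(mktemp -d)"
git init -q repo && cd repo
git config core.untrackedCache true   # cache results of untracked scans
git status -s                         # later scans can reuse the cache
```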
Re: (Score:2, Insightful)
But no, he had to write an operating system besides.
Actually, he wrote a kernel...
Re: Linus Wins Again (Score:2, Funny)
We know already Richard, we know. Go finish Hurd, then maybe you can stop being so bitter.
Re:Linus Wins Again (Score:5, Interesting)
We have Linus' famous picture on posters and t-shirts at my work. (NVIDIA)
My boss was sitting just a few feet beyond the camera when Linus had his rant too. It was epic, many of the Nvidians were squirming in their seat.
Re: (Score:2)
Well, much of Linus' criticism was against the ARM SoC group at NVIDIA. And over the last few years they've really taken upstreaming seriously and have been doing a good job at using upstream frameworks instead of rolling everything themselves. (I'm biased, obviously.) You can boot an upstream kernel on most (all?) of the Tegra development kits, which was frustratingly not the case a few years ago.
As for NVIDIA's desktop GPU drivers, many of the key open source people accept that there isn't going to be an ope
Re: (Score:2)
Git is a piece of shit revision control system.
"revision control system" is not a category, it's a specific utility. Learn the meaning of words, then you can come back and throw in dirty language if you still think that makes you look cool and savvy.
Re: (Score:2)
The problem with git is that branches don't have proper history. There is the reflog but it is disabled by default on bare repos (main project repos are usually bare), can't be accessed remotely and is generally intended as more of a disaster recovery feature than a long term history feature.
Commits have history but that history doesn't tell you when the commit was pushed to the main project repo, it doesn't tell you what branch the commit was created on, it doesn't tell you who promoted the code from a tes
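For readers unfamiliar with the feature under discussion, a self-contained sketch of the reflog and the switch that leaves it disabled on bare repos:

```shell
# Self-contained sketch of the reflog the parent describes: a local
# journal of where a ref has pointed, distinct from commit history.
cd "$(mktemp -d)"
git init -q repo && cd repo
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "first"
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "second"
git reflog                            # one entry per move of HEAD
git config core.logAllRefUpdates true # off by default on bare repos
```

As the parent notes, this is local-only and disaster-recovery-oriented; it is not a remotely queryable history of when pushes landed.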
Time to let go (Score:4, Insightful)
If you try to make Git work like Subversion, you're doing it wrong. Stick with Subversion (or CVS, for that matter) if that's what makes you comfortable and if you want to obsess about stuff like branch history. Otherwise read a good tutorial and pick a mainstream branching strategy such as Git flow.
Git branches are fantastic. They make life easier by allowing you to focus on the code without having to deal with side effects of Subversion-style branches, such as broken paths in config files. As for directory renames, if you use Git properly there's no problem.
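As a concrete illustration of such a mainstream flow, a minimal self-contained sketch (branch names are made up):

```shell
# Self-contained sketch of a simple mainstream branching flow:
# a short-lived topic branch, merged back with an explicit merge commit.
cd "$(mktemp -d)"
git init -q repo && cd repo
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "base"
git checkout -q -b feature/demo       # topic branch off the main line
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "work"
git checkout -q -                     # back to the original branch
git -c user.email=a@b -c user.name=a merge -q --no-ff feature/demo -m "merge"
git log --oneline --graph             # the merge bubble stays visible
```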
Re: (Score:2)
I know exactly how git branches work, thanks. Directory renames are not supported in merges, even if you use git 'properly', which I am doing as it happens. I also know how git flow works, and I think it's highly overcomplicated and best avoided. Branches increase development time, since every commit to any branch has a cost associated with it that multiplies by the square of the number of other branches, if those branches are eventually to be merged together. That's just the reality of coding with large tea
Re: (Score:2)
Re: Linus Wins Again (Score:2)
Re: (Score:2)
Well, Mr Kelvin, since I wasn't the one who moved the directory, but just the one that attempted the merge, I guess I must be.
I thought git was supposed to do rename tracking all by itself? And also, that it didn't track directories, just files. And in any case, I tested it, and a merged-in directory rename does not move newly added files to that directory, so you're basically wrong. Oh well.
I know perfectly well how git works, thank you very much, and there's nothing in the way in which it's designed that
Re: Linus Wins Again (Score:2)
Re: (Score:2)
Oh Mr Kelvin, you doth protest too much.
Look, mate. If you merge in a directory rename, files that you've added to that moved directory prior to the merge are not moved. That's the beginning and the end of it, sorry. Fine, I get that git doesn't do that, because it doesn't really believe in directories. No amount of typing 'git status' or 'git add' is going to make any difference.
Just because hundreds of thousands of people do something daily, doesn't make it any bloody good. I mean, do you have any idea how
Re: Linus Wins Again (Score:2)
Re: (Score:2)
git add what, Lord Kelvin?
Re: (Score:2)
Re: (Score:2)
Doctor Kelvin, I must apologise for having riled you to the extent that you called me a dumbfuck, but here's what I'm talking about.
git init
mkdir A
touch A/file.txt
git add A/file.txt
git commit -m 'a'
git checkout -b Branch
git mv A B
git commit -m 'b'
git checkout master
touch A/file2.txt
git add A/file2.txt
git commit -m 'c'
git merge Branch
ls
And lo and behold, the added file is still in the old directory, even though the merged branch had re-named the directory. How about that? I read y
Re: (Score:2)
I have no idea what you are talking about. When I follow the steps you listed I get a master branch with two directories, A and B. A contains file2.txt and B contains file.txt. And the rude person is you, who feels it was rude for me to expect you to ask a question in an intelligent way rather than wasting our time.
Re: (Score:2)
But I renamed the directory to 'B' in my branch. So, for instance, the branch decided to move an entire library into another directory. However, my merge did not apply that directory rename to the existing tree, which is what a directory rename should really do, but instead completely ignored it, and only added the (now renamed) file. At the very least, this should generate a conflict. Maybe, say, a tree conflict. Ha. See what I did there?
Anyway, the outcome isn't what I wanted, and is what the text "If you m
Re:Linus Wins Again (Score:4, Interesting)
Re: Linus Wins Again (Score:2)
What were they using before? (Score:2)
Re: (Score:3)
Source Depot... which is a modified version of Perforce.
Re: (Score:2)
I thought they had an "eat your own dog food" policy, so I'm going to guess Team Foundation Server.
Re: (Score:2)
It's in the article.
At the time, Microsoft was using SourceDepot, a customized version of the commercial Perforce version control system, for all its major projects.
Re: (Score:2)
Does that require or forbid use of the Volume Shadow Service on the drives it resides on?
Re: What were they using before? (Score:4, Insightful)
Get real. Windows has owned the market for 25+ years already, and Western civilization has become consistently more computerized during that period, which definitely didn't happen because of OpenVMS or QNX or some other wonder of software engineering.
Besides John Deere or Tupperware, not a lot of products have enjoyed such stability.
Re: (Score:2)
Re: (Score:2)
The web began to have some kind of economic relevance around or after the first dot-com bubble. The first mainstream version of Windows (3.0) came out in 1990. And already at the time Microsoft had done a lot to promote end-user computing (i.e., the PC).
It's okay to dislike Windows but don't rewrite history, even if you feel the need to mention iPhones.
VSS ? (Score:2)
Re:VSS ? (Score:5, Funny)
When did they stop using Visual Source Safe ?
Microsoft Visual SourceSafe was first released in 1994, so by my estimate they stopped using it in 1994.
Re: (Score:2)
Microsoft Visual SourceSafe was first released in 1994, so by my estimate they stopped using it in 1994.
Nonsense, they continue to use it to this day. It plays a prominent role in their initiation/hazing of new interns.
Re: (Score:2)
Truth be told, Microsoft didn't create SourceSafe. They bought it. In some cases, like SQL Server or Visio, their acquisitions have been a success, but more often than not (such as SourceSafe or Dynamics AX) they have gone downhill fast.
Re: (Score:2)
I don't think many people have been using VSS for a long time. It was replaced by TFVC (Team Foundation Version Control) back in I think about 2005 as part of TFS (Team Foundation Server).
I don't think they ever used this for the Windows source code though. I think they were using Perforce or something.
Re: (Score:2)
Note that while I am not going to defend VSS, and it is fun to poke at a company for not eating their own dog food, in this case it *could* make sense.
What is appropriate for managing a project as complex as an entire operating system is not necessarily what is appropriate for 99% of their customers who want to manage code in a project.
Just like the way MS runs Azure probably looks nothing like what they have customers run.
Something's wrong (Score:5, Funny)
Re: (Score:2)
That's funny. When I tried that, it said copyright 1980 Gary Kildall and Digital Research... The repo must be corrupt...
https://en.wikipedia.org/wiki/... [wikipedia.org]
https://en.wikipedia.org/wiki/... [wikipedia.org]
"Kildall obtained a copy of PC DOS, examined it, and concluded that it infringed on CP/M. When he asked Gerry Davis what legal options were available, Davis told him that intellectual property law for software was not clear enough to sue.[12] "
Re: (Score:2)
Cool stuff (Score:5, Interesting)
The work they've done to make Git scale to fit their needs sounds great, and I see they've open-sourced the key components. That's awesome. At the moment it looks like GVFS is Windows-only (not a big surprise -- and not a complaint; they built what they needed). I'd like to see someone port it to Linux and make this infrastructure more broadly available. It sounds like it would be much nicer to work on than the "repo" tool that Android layers on top of Git to enable managing a whole bunch of smaller repositories.
Re: (Score:2)
Microsoft released GVFS as open source. It's there for anyone to patch.
My guess is MS hatred is why in the opensource community
Re: (Score:3)
Jakub's Mastering Git book [packtpub.com] discusses briefly that git is less a version control system in itself and more a tool for building version control systems.
Alternative user interfaces like Zit, Cogit and Yap show that there is some merit to this view.
Git's content-addressable data store with locally computable global identifiers can form the basis of a generic storage engine. Microsoft has created what appears to be another file system out of git. There are many other filesystem implementations [kernel.org].
The git wrap
Re: (Score:2)
Re: (Score:2)
The work they've done to make Git scale to fit their needs sounds great, and I see they've open-sourced the key components. That's awesome. At the moment it looks like GVFS is Windows-only (not a big surprise -- and not a complaint; they built what they needed). I'd like to see someone port it to Linux and make this infrastructure more broadly available. It sounds like it would be much nicer to work on than the "repo" tool that Android layers on top of Git to enable managing a whole bunch of smaller repositories.
Why? The only reason to use their GVFS is to (a) work with VS and TFS - both of which are Windows-only (no, VS Code doesn't count, it doesn't have 95% of the features of VS), and (b) to use their broken development model of locking files when you're working on them. Neither of those is desirable to anyone using git or who understands proper VCS systems.
Huh? I see absolutely nothing related to locking in GVFS. The GVFS protocol [github.com] has no mechanisms for acquiring or releasing locks, and locking isn't mentioned anywhere else in the documentation. From everything I can see in the code, issues, documentation and TFA, GVFS just provides lazy fetching of git objects, which makes it possible to "check out" a large repository without having to wait for everything to download, and also optimizes the stat'ing of files that git does so much of.
Wait, what? (Score:2)
I really, really hope those 1900 unreviewed pushes are all developers just wanting to make sure their code is backed up and are pushes to private branches.
Re: (Score:2)
A review system can easily cover multiple pushes per review; the git-based one we use certainly does, and it's not terribly sophisticated otherwise.
Where's the link to the repo? (Score:2)
nt
1900 unreviewed commits per day? (Score:2)
with 8,500 code pushes made per day and 6,600 code reviews each day
Dang. We can't get away with that where I work.
Re: (Score:2)
Is Source Depot better than Clear Case?
Re: (Score:3)
Not using any revision control system, and instead just making copies of files before you change them and manually labeling them foo.v1.1.c and the like, is better than ClearCase.
Re: (Score:2)
It does. The original statement was that "anything is better than ClearCase". Someone named two other really crappy VCSs, implying that these two were even worse than, or at least no better than ClearCase, hoping to disprove the OP's statement. I didn't address those specifically, but I went for the most extreme example I could think of which was no real version control at all except doing it manually with version numbers in the file names, and claiming that even that (which surely is clearly worse than
Re: (Score:3)
"SourceSafe" is even funnier than "Microsoft Works"
Re: (Score:2)
Why are they allowing the use of Source Depot as a way to eventually check in to Git?
It would seem better to draw a line in the sand and say "beyond this point, we are all using Git... though for historical reasons the old repos will be kept around for reference and SE"?
Re:Distributed Hg. (Score:5, Informative)
How come they didn't go with Mercurial?
Some obvious reasons off the top of my head:
You can also make the argument that Git was designed from the beginning to be suitable for developing an operating system. Or, put more bluntly, it was designed to be used by programmers who are smart enough to work on an operating system. Yes, the Mercurial CLI is generally easier to come to grips with, but that isn't a compelling enough reason on its own.
Keep in mind also that the overall direction of Mercurial is increasingly being driven by the needs of Facebook's dev teams. Which is great to see, in the sense that they're returning their enhancements to the community..... but by and large they're building web properties, not operating systems, so the priorities may be different.
Re: (Score:2, Funny)
git's interface wasn't "designed" at all, and it shows badly. That's why there's almost one tutorial on the web per person who figured out how to use it.
The article summary also leaves out the minor point that MS had to write an entire abstraction layer underneath Git because it's so incapable of handling a large repository. And yes, there are actually good reasons to have a "monolithic" repository. Just because your favorite version control system can't do something doesn't mean it's a bad idea.
Re:Distributed Hg. (Score:5, Insightful)
The article summary also leaves out the minor point that MS had to write an entire abstraction layer underneath Git because it's so incapable of handling a large repository.
Not completely true. They call it GVFS, but all it really does is prevent the entire repository from being downloaded when you clone it. Instead it downloads "only what you need". And there are a couple of patches to make git aware that this is happening, so that it stats only what is local and not the whole repository. One might argue that since the developer teams are not working on the entire codebase at once but rather on, let's call them "modules", within the larger repository, the repository itself should be made more modular to match this development pattern. That would be more in line with the way Git was designed in the first place, and these extensions would not be as necessary. Still, to have the capability is nice.
Just because your favorite version control system can't do something doesn't mean it's a bad idea.
No, but pick the right tool for the job. If you are not developing modular, self-contained code in a decentralized fashion, don't use a source control system designed with those explicit goals in mind.
Notice how LibreOffice splits up their fairly large codebase into several smaller repositories,
https://github.com/LibreOffice [github.com]
Seems to work pretty well for them.
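For completeness, one stock-git way to stitch split repositories like that back into a single tree is submodules; a self-contained sketch with throwaway repos (names are made up):

```shell
# Self-contained sketch: composing split repositories with submodules,
# one vanilla-git answer to "one product, many repos" (names made up).
cd "$(mktemp -d)"; base=$PWD
git init -q module
git -C module -c user.email=a@b -c user.name=a \
    commit -q --allow-empty -m "seed"
git init -q product && cd product
git -c protocol.file.allow=always \
    submodule add -q "file://$base/module" module
git -c user.email=a@b -c user.name=a commit -q -m "pin module"
git submodule status    # the superproject pins one module commit
```

The superproject records an exact commit of each module, so the split repos still build reproducibly as one product.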
Re: (Score:2)
I may add GVFS is open source and Microsoft provided the patch. It's up to the Git team to include it.
Re: Distributed Hg. (Score:2)
Re: Distributed Hg. (Score:2)
Re: (Score:2)
Seems easier than trying to manage independent source repos for dependencies.
If that's the only reason, then that's pretty silly. A decent dependency-management system is the right solution.
Let's say package A depends on packages B and C. B makes an incompatible change that breaks A, so A has to know that only the previous version of B will work without a patch. Patch goes in to A, and now new version of A works fine with new version of B. Meanwhile, a new feature in A requires an update to C, so C is updated and A development continues. A build/runtime-dependency system should be a