I'm not really sure of the exact economics of this, because I don't run these companies. But I know what companies have told me: "we don't hire junior developers because we can't afford to have our senior developers mentor them." I've seen the rates for senior developers because I am one, and I've had project managers ask me to allocate my time for budgeting purposes. I know the rate is anywhere from $190 to $300 an hour. That's what companies believe they are losing on junior devs.
One company involved in the test told the Wall Street Journal that for over 80 percent of the 911 calls where Google's system was used, the tech giant's location data were more accurate than what wireless carriers provided. The company, RapidSOS, also said that while carrier location estimates had, on average, a radius of around 522 feet, Google's data gave estimates with radii of around 121 feet. Google's data also arrived more quickly than carrier data typically did.
The venture-capital (VC) industry is booming. American visitors return from Beijing, Hangzhou and Shenzhen blown away by the entrepreneurial work ethic. Last year the government decreed that China would lead globally in artificial intelligence (AI) by 2030. The plan covers a startlingly vast range of activities, including developing smart cities and autonomous cars and setting global tech standards. Like Japanese industry in the 1960s, private Chinese firms take this "administrative guidance" seriously.
1. Computer retailers stopped installing development environments by default. As a result, anyone learning to program has to start by installing a software development environment (SDE) -- and that's a bigger barrier than you might expect. Many users have never installed anything, don't know how to, or might not be allowed to. Installing software is easier now than it used to be, but it is still error-prone and can be frustrating. If someone just wants to learn to program, they shouldn't have to learn system administration first.
2. User interfaces shifted from command-line interfaces (CLIs) to graphical user interfaces (GUIs). GUIs are generally easier to use, but they hide information from users about what's really happening. When users really don't need to know, hiding information can be a good thing. The problem is that GUIs hide a lot of information programmers need to know. So when a user decides to become a programmer, they are suddenly confronted with all the information that's been hidden from them. If someone just wants to learn to program, they shouldn't have to learn operating system concepts first.
3. Cloud computing has taken information hiding to a whole new level. People using web applications often have only a vague idea of where their data is stored and what applications they can use to access it. Many users, especially on mobile devices, don't distinguish between operating systems, applications, web browsers, and web applications. When they upload and download data, they are often confused about where it is coming from and where it is going. When they install something, they are often confused about what is being installed where. For someone who grew up with a Commodore 64, learning to program was hard enough. For someone growing up with a cloud-connected mobile device, it is much harder. theodp continues: So, with the Feds budgeting $200 million a year for K-12 CS at the behest of U.S. tech leaders, can't the tech giants at least put a BASIC on every phone/tablet/laptop for kids?
NBC News' partners at Neo4j have put together a "get started" guide to help you explore the database of Russian tweets. "To recreate a link to an individual tweet found in the spreadsheet, replace 'user_key' in https://twitter.com/user_key/status/tweet_id with the screenname from the 'user_key' field and 'tweet_id' with the number in the 'tweet_id' field," reports NBC News. "Following the links will lead to a suspended page on Twitter. But some copies of the tweets as they originally appeared, including images, can be found by entering the links on web caches like the Internet Archive's Wayback Machine and archive.is."
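For anyone who would rather script this than edit URLs by hand, here is a minimal sketch of the reconstruction NBC News describes. It assumes the spreadsheet has been exported as a CSV with 'user_key' and 'tweet_id' columns (those field names come from the quote above; the file name "russian_tweets.csv" and the Wayback Machine lookup are my own additions for illustration).

    # Rebuild tweet URLs from the spreadsheet fields described by NBC News.
    # Assumes a CSV export with 'user_key' and 'tweet_id' columns;
    # "russian_tweets.csv" is a hypothetical file name.
    import csv

    def tweet_url(user_key: str, tweet_id: str) -> str:
        """Recreate the original tweet link from the two spreadsheet fields."""
        return f"https://twitter.com/{user_key}/status/{tweet_id}"

    def wayback_lookup_url(url: str) -> str:
        """Wayback Machine page listing any archived captures of the tweet."""
        return f"https://web.archive.org/web/*/{url}"

    with open("russian_tweets.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            original = tweet_url(row["user_key"], row["tweet_id"])
            print(original, "->", wayback_lookup_url(original))

Since the original tweets are suspended on Twitter, the printed links are mainly useful as keys to paste into the Wayback Machine or archive.is, as the guide suggests.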
Here is the question: Could Linux ever be made fully compatible with all Windows and Mac software? What I mean is a Linux distro that lets you successfully install/run/play just about anything significant that says "for Windows 10" or "for OSX" under Linux, without any sort of configuring or crazy emulation orgies being needed? Macs and PCs run on the exact same Intel/AMD/Nvidia hardware as Linux: same mobos, same CPUs and GPUs, same RAM and storage devices. Could Linux ever be made to behave sufficiently like those two OSes that a computer buyer could "go Linux" without negative consequences, like not being able to run essential Windows/Mac software at all? Or is Linux behaving like Windows and OSX simply not technically doable, because Windows and OSX are just too damn complex to mimic successfully?