by sams99 on 3/31/19, 11:56 PM with 114 comments
by jacobsenscott on 4/1/19, 4:09 AM
Also, you're a hero because you catch and fix production problems nobody else sees or understands, because production is running Linux.
If you have a problem on OS X you just get 500 people telling you to reset your PRAM, when really you just need to wait for the next OS X update. You certainly can't roll back an OS X update. Nobody knows what's wrong because nobody has access to the code. Even if you did - each update is millions of lines of code. It is crazy town.
Same for Windows, except there is no PRAM to reset. You just reboot until the next update, I guess.
by mastazi on 4/1/19, 3:28 AM
The article says that you pay a "VM tax" by running a Linux VM on Windows or by using WSL, but by using Linux directly you end up "paying" in other areas, for example in terms of less-than-perfect drivers (which leads to problems such as worse battery life, worse graphical performance, etc.). In an ideal world, laptop and GPU makers would put as much effort into developing Linux drivers as they do into their Windows drivers, but unfortunately this is not the case.
Another point is that, even if you are running Linux on your dev machine, it is often challenging to have a local environment that is as close as possible to your production one. For example, where I work we have Ubuntu 16.04, Ubuntu 18.04 and CentOS (with various combinations of installed packages); even if I were using a Linux laptop, I would still need containers to get all the different environments right. The only case where you could use no containers at all is if all of your servers ran exactly the same environment.
Finally, there is the issue of some software not being available for Linux. 95% of the time you can find a Linux equivalent that works for you, but in my experience there were still rare cases where I had to resort to Wine or a Windows VM just to run a specific tool I needed to do my job.
PS: My job, both while I was using Linux as my main desktop OS and during the couple of years after I switched back to Windows, involved PHP, Python and Java web apps deployed on Linux servers, so my comments above should be taken in that context. Other fields of software development may have different factors to take into account.
PPS: I have also tried macOS. On the one hand it's great that the Mac is UNIX, but on the other hand it's not Linux, so if you want to closely replicate your production Linux environment you will still need VMs or containers.
by nickjj on 4/1/19, 1:01 PM
I've run 100,000+ line Rails apps in WSL (well, technically through Docker, which I connect to from WSL) and I never noticed a slowdown bad enough to make me think "this sucks". It's always been pretty good. I run all sorts of Rails, Flask, Phoenix and Webpack-driven apps, and all of them run fast enough that I don't think twice about it.
Personally, I find the WSL setup somewhat close to native Linux in terms of the user experience. I'm not talking about I/O performance, but about how it feels to use the OS in general.
For example:
I spend 99% of my time in a WSL terminal using tmux + terminal Vim + ranger. So that takes care of coding and managing files.
Then I use a browser and other graphical apps (image / video editors) that run directly in Windows.
Dexpot sets up virtual screens with key binds that are comparable to how i3wm lets you switch and move windows to another screen.
Keypirinha lets you launch apps with a fuzzy finder (like dmenu but better, IMO).
AutoHotkey lets you easily manage global hotkeys (just like i3 does), and more.
When you put all of that together, you get a really really good development experience that also doubles for gaming and running programs that don't have a good alternative on Linux (such as Camtasia on Windows).
Then, for the icing on the cake, since you're running Ubuntu 18.04 in WSL, you can provision WSL itself with the exact same configuration management scripts you would use to provision a production box. For me specifically, I run all of the same Ansible roles against WSL that I do in production, and I can set the whole thing up with one Ansible command. Plus my dotfiles work exactly as they do on my native Linux laptop, so it's easy to keep things in sync and feeling the same.
This all runs on an i5 3.2GHz / 16GB of RAM / SSD / etc. $750 desktop built 5 years ago.
Even if the Apple tax didn't exist, I would still use this Windows / WSL setup if I weren't in a position to run Linux natively.
by StillBored on 4/1/19, 3:14 AM
AKA, VMs are slower than bare metal. News at 10.
Sure, there are things Windows tends to be slower at, but similarly there are things Linux tends to be slower at. For general desktop usage I think you will find two things: Windows does quite well, and desktop-level virtualization tends to be slower than server virtualization, because most desktop users will be using a virtio or emulated IO access method, whereas servers tend to punch PCIe adapters through into the VMs. Either way, the overhead of additional translation layers and VM exits for various things is always going to be worse than just running on bare metal. Whether that works out to barely noticeable for compute-heavy benchmarks that are TLB-friendly, 2x for small-packet edge-case IOs, or somewhere in between is completely application-dependent.
by pixelmonkey on 4/1/19, 5:03 AM
I wrote about this in the context of the Lenovo X1C laptop here:
http://amontalenti.com/2017/09/01/lenovo-linux
I also recently built a Linux desktop from ~$950 of commodity parts (including a pair of free GPUs a crypto friend donated to me after his startup died). I use it as a devserver and a GPU rig for playing with CUDA, PyTorch, and TensorFlow.
Aside from the requirement of proprietary Nvidia drivers, this whole box works perfectly in Linux too, is blazing fast, and operates silently and with low power consumption. (The case itself is bulky, but it’s stationary.) I think an equivalent Mac would cost a 2-3x multiple and run less well for developer workloads.
https://pcpartpicker.com/user/amontalenti/saved/r4rLJx
Say what you will about Linux, but if you choose your hardware carefully, it truly does “just work” these days. And you can’t beat having access to scriptable everything and source code everywhere. That said, I keep a Mac Mini around because there is some proprietary stuff you can’t avoid in the Apple ecosystem (e.g. XCode for iOS, Safari Debugger, Keynote/Pages, ...)
by underwater on 4/1/19, 3:37 AM
In my experience, any multi-minute operation invites a context switch or a coffee break, so they all end up being roughly equivalent in practice.
by snarfy on 4/1/19, 12:48 PM
There's too much fiddling around to get it working right as a desktop. Audio has always been a mess. Wi-Fi drivers are still binary blobs. Fonts, HiDPI, multi-monitor support, Wayland and systemd are all still issues today. I really don't want my development machine to match production; production is a stripped-down image for a reason. I spend far more time writing code than running it, so the VM performance hit is a moot point.
To be fair, Windows 10 has plenty of issues of its own that need tweaking, like the only way to disable Cortana being through group policy edits. And then they re-enable it on the next update and put Candy Crush ads in my Start menu. It's pretty infuriating, but still less so than manually fixing Wi-Fi drivers from USB boot drives.
by NightlyDev on 4/1/19, 3:43 AM
The best part is the performance and a realistic bare-metal environment. You can profile code and actually trust the results. You can't say the same for virtualization, containers or other platforms.
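As a rough illustration of the kind of measurement in question, here is a minimal Ruby sketch using the stdlib Benchmark module; the workload inside the block is just a placeholder, not anything taken from this thread:

    require "benchmark"

    # Time a hot loop; on bare metal the user/system/real numbers reflect the
    # hardware directly, while inside a VM or container they can be skewed by
    # the hypervisor and virtualized clocks.
    result = Benchmark.measure do
      100_000.times { "wsl-vs-native".reverse.upcase }
    end

    puts result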
by robocat on 4/1/19, 4:06 AM
"I was paying a pretty high tax for deciding to stick with Windows” was likely "I was paying a pretty high tax for deciding to use a VM”.
I have little love for Windows, but I would prefer to see numbers that place the blame correctly.
by ryan-allen on 4/1/19, 3:46 AM
I think this is the point: given that you can get an inexpensive desktop CPU with 6 cores and 12 threads, it wouldn't matter so much if you paid a 20% perf tax with WSL when you have approximately 12x the throughput.
You could always offload full test suites to an external CI box upon check-in and work with a subset of unit tests locally, once the suite becomes prohibitive for a single machine.
by bluedino on 4/1/19, 3:24 PM
It's much better than it was in the PowerPC days, when (IMO) OS X was almost too slow to be usable. We have it good these days with faster graphics, 12-core CPUs and super-fast SSDs...
But give those same advantages to Windows or Linux and OS X is handily beaten in just about every benchmark you can think of.
by rcarmo on 4/1/19, 6:32 AM
Also, I get it that some kinds of testing “feel better” when done locally, but... why not have those run on a cloud VM?
by pmontra on 4/1/19, 6:02 AM
Still, AFAIK VirtualBox has a problem supporting multiple cores, and a single-core VM is heavily handicapped even for Ruby. Parallel testing wasn't built into Rails, but it's definitely a thing.
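For reference (and going beyond what pmontra wrote), parallel testing is built into Rails 6's test runner. A minimal sketch of a test_helper.rb follows; the worker setting shown is the commonly used default, not something specified in this thread:

    ENV["RAILS_ENV"] ||= "test"
    require_relative "../config/environment"
    require "rails/test_help"

    class ActiveSupport::TestCase
      # Fork one test worker per CPU core -- exactly the kind of parallelism
      # a single-core VM cannot exploit.
      parallelize(workers: :number_of_processors)
    end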
by ncmncm on 4/1/19, 4:21 AM
Surely someone, somewhere has explained that Windows runs faster in a VM than on bare metal, because Linux is better at file systems and buffer management? Or that relying on drivers from all over makes your system less reliable than the fixed set used in the VM?
Me, I cannot imagine running MS Windows on bare metal. It just feels wrong.
by craz8 on 4/1/19, 4:13 AM
No need to wait for Rails 6 to get this boost.
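One pre-Rails-6 route (presumably the sort of thing craz8 has in mind, though the comment doesn't say) is the parallel_tests gem; a sketch of the Gemfile entry:

    # Gemfile
    group :development, :test do
      # Multi-process test runs for apps not yet on Rails 6; typically
      # invoked with `bundle exec rake parallel:test` after setting up the
      # per-process databases with `rake parallel:create parallel:prepare`.
      gem "parallel_tests"
    end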
by 0x8BADF00D on 4/1/19, 3:25 AM
The haters he’s replying to are literally splitting hairs.
Wow, they shaved 3 minutes off running an interpreted-language test suite. This is the ABSOLUTE state of the industry right now.