by serverlessmom on 5/24/24, 2:54 PM with 51 comments
by dboreham on 5/24/24, 3:42 PM
"there was often a way to run the whole system, or a facsimile, on your laptop with minimal friction"
Any time the code you're developing can't be run without some special magic computer (that you don't have right now), special magic database (that isn't available right now), special magic workload (production load, yeah...), I predict the outcome is going to be crappy one way or another.
by MeetingsBrowser on 5/24/24, 3:34 PM
I haven't heard of this. I thought shifting left was the new hotness. I was under the impression that finding and fixing mistakes as soon as possible is considered best practice.
> Shifting Testing Left Is a Return of an Older, Better System
I don't really understand. I thought the "old way" was to wait until everything was done to check if it works, and the "new way" was to add tests for individual components as you go, making sure nothing breaks as new features are added.
by taneq on 5/24/24, 3:36 PM
by Kinrany on 5/24/24, 3:23 PM
by kelsey98765431 on 5/24/24, 4:01 PM
The question is when to test.
There are benefits and tradeoffs to when and how you test your application. Sometimes you simply cannot test beforehand, such as when you are integrating with a live system where the cost of replicating the environment is very high because it's a managed solution contract with a large (cough, IBM, cough) vendor. In those cases the question is often when to test against a mock and when to test against the real thing. But this is still just a question of how and when.
For a solo developer, it makes plenty of sense to write the feature and then test it; writing it may be how you actually discover the behavior you want. With a team this can still be the case. In a larger team you can hand these rapid prototypes to QA engineers, who find the issues so that a different type of developer can fix the bugs.
The reverse also works. You can discover your feature's behavior by experimenting with tests until the desired behavior is asserted to be occurring, through things like type validation and object patching for instrumentation, or whatever profiling or debugging tools and test suites you like.
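That test-first discovery loop might look like this sketch (the `slugify` function is a hypothetical example): write the assertions describing the behavior you want, then implement until they pass.

```python
import re

# Step 1: the assertions below existed before the implementation did.
# Step 2: implement until they pass.
def slugify(title):
    slug = title.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to "-"
    return slug.strip("-")

assert slugify("Hello, World!") == "hello-world"
assert slugify("  Shifting Left  ") == "shifting-left"
```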
The hard answer is that there is no one answer. Just as iterative development urges us to keep iterating on our features, we should iterate on our tests. Do not worship coverage; use it as a tool. Do not fear testing. It's just code, after all, so you could actually build a full application that is just a series of tests (experiments) that run, and if everything passes, you've written the same thing as a bunch of conditional checks in your library code.
Do what works for you, your team, the project you are on, and where you are at. If you're struggling, add some more tests. If you have so many failing tests you don't know what to do, write some more code and string your tests together with new functions; that gives you a nice way to say "OK, this is the next step of my iteration." Some like to start by testing, some like to finish with it. I do both. Whatever you do, put love into it. And if that's super sturdy prod code with no tests that will never fail, because you are the safest data handler in the world with the most flexible business logic in the world, go for it. Just remember: if you someday forget how things work and don't want to read documentation or experiment with the live system, you can always write tests to remind you how it works.
Tests are a form of documentation. They may not prove what you think they prove, but they document a fact about your code that is objective as determined by the code itself. This is a wonderful power, use it however you wish.
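As an example of a test documenting an objective fact, this sketch records Python's actual built-in `round()` behavior (banker's rounding, i.e. round-half-to-even), which a prose comment could silently get wrong but a test cannot:

```python
def test_round_uses_bankers_rounding():
    # Python 3's round() rounds halves to the nearest even integer,
    # not half-up as many readers expect. The test documents that fact
    # and will fail the moment it stops being true.
    assert round(0.5) == 0
    assert round(1.5) == 2
    assert round(2.5) == 2

test_round_uses_bankers_rounding()
```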
by moomin on 5/24/24, 4:12 PM
And if all the work you're doing is well understood before you start, I suggest finding somewhere your manager trusts you with the hard stuff.
by dpflan on 5/24/24, 3:31 PM
by dpflan on 5/24/24, 3:32 PM
by jf22 on 5/24/24, 5:30 PM
I think we overuse the term waterfall. Having a QA phase as part of a development cycle doesn't make something "waterfall."
by ssfrr on 5/24/24, 4:50 PM