by almosthere on 2/7/25, 6:46 PM
Having used Spark for the past 8 years or so, it's definitely a solid basis for data engineering. I use it for generating reports the most, but sometimes we have large projects to get data into different staging databases. I use it a lot with Elasticsearch or Parquet. Basically it helps you write large joins and flatten the result into a store that can more quickly perform aggregations on that flattened result (like Elasticsearch) or a columnar database.
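A minimal sketch of that join-and-flatten pattern in PySpark; the paths, table names, and columns are hypothetical, and the output is written as Parquet (an Elasticsearch sink would go through the elasticsearch-hadoop connector instead):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("flatten-for-aggregation").getOrCreate()

# Hypothetical normalized inputs stored as Parquet.
orders = spark.read.parquet("s3://warehouse/orders")
customers = spark.read.parquet("s3://warehouse/customers")
products = spark.read.parquet("s3://warehouse/products")

# Large join that denormalizes (flattens) the data into one wide table.
flat = (
    orders
    .join(customers, "customer_id", "left")
    .join(products, "product_id", "left")
    .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# Write the flattened result somewhere optimized for aggregations,
# e.g. partitioned Parquet for a columnar query engine.
flat.write.mode("overwrite").partitionBy("order_date").parquet("s3://serving/orders_flat")
```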
by datadrivenangel on 2/7/25, 2:57 PM
If you have experience in any data frame library (like Pandas) and SQL, you can pick up PySpark pretty easily... with the one caveat that writing good data pipelines in any language gets much harder when you start looking at ways to actually process big data (~20+ TB). Modern SQL engines are so good though.
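To illustrate how directly the Pandas and SQL knowledge carries over, here is a rough sketch (the table and column names are made up) showing the same aggregation written both ways:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pandas-to-pyspark").getOrCreate()

# Hypothetical events table, also registered as a temp view for SQL.
events = spark.read.parquet("s3://lake/events")
events.createOrReplaceTempView("events")

# DataFrame API: reads much like a pandas groupby/agg.
by_user = events.groupBy("user_id").agg(F.count("*").alias("n_events"))

# SQL: the same result through Spark's SQL engine.
by_user_sql = spark.sql(
    "SELECT user_id, COUNT(*) AS n_events FROM events GROUP BY user_id"
)
```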
by philomath_mn on 2/5/25, 3:07 PM
by francocalvo on 2/5/25, 9:24 PM
I'm a Data Engineer who uses Spark daily. I guess the only important cert would come from Databricks, but I think it will be more worth your while to read the book mentioned here and try to do a little project ingesting and transforming data.
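A toy version of that kind of ingest/transform project might look like the following sketch (file paths and column names are invented for illustration):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("toy-ingest").getOrCreate()

# Ingest: read raw CSV files (hypothetical path).
raw = spark.read.option("header", True).csv("data/raw/trips/*.csv")

# Transform: cast types, drop bad rows, derive a partition column.
clean = (
    raw
    .withColumn("trip_distance", F.col("trip_distance").cast("double"))
    .filter(F.col("trip_distance") > 0)
    .withColumn("pickup_date", F.to_date("pickup_datetime"))
)

# Load: write partitioned Parquet for downstream queries.
clean.write.mode("overwrite").partitionBy("pickup_date").parquet("data/curated/trips")
```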
by hnthrowaway0315 on 2/5/25, 6:22 PM
Just get a job since you are already senior. You can learn it on the job. Find a few tutorials if you must, but people should be able to pick it up in a few weeks for basic work.
by rookie123 on 2/5/25, 12:33 PM
Bump!