At Spectrum Labs, our deep learning models are trained and validated against large data sets that can only be processed in a distributed manner. Data scientists need the ability to invoke batch jobs as part of the model development lifecycle, and these jobs must run in a scalable, fault-tolerant manner. We chose Argo Workflows as our data pipeline framework because it provides a container-native workflow engine for orchestrating parallel jobs on Kubernetes.
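To make this concrete, here is a minimal sketch of how a data scientist might submit such a batch job programmatically, using the Hera Python SDK (hera-workflows) for Argo Workflows. The namespace, server host, image, and job command are illustrative assumptions, not Spectrum Labs' actual pipeline.

```python
# Minimal sketch: submit a batch job as an Argo Workflow via the Hera SDK.
# All names below (host, namespace, image, module) are hypothetical.
from hera.workflows import Container, Workflow, WorkflowsService

with Workflow(
    generate_name="model-validation-",  # Argo appends a random suffix
    entrypoint="validate",
    namespace="argo",
    workflows_service=WorkflowsService(host="https://localhost:2746"),
) as w:
    # Each step runs in its own container, so the job is isolated,
    # reschedulable on failure, and scales with the cluster.
    Container(
        name="validate",
        image="python:3.11-slim",  # hypothetical job image
        command=["python", "-m", "validate_model"],
    )

w.create()  # submit the workflow to the Argo server
```

Because the workflow is just a Kubernetes resource, the same job definition can be retried on failure, fanned out across nodes, or wired into a larger DAG of training and validation steps.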