In this article, we will discuss how a distributed compute engine executes a…

This is the first part of a series of short articles that will take you through…
When Velox was open sourced in August 2021, it was not nearly as easy to use or as portable as it is today. For Velox to become the unified execution engine that blurs the boundaries between data analytics and ML, we needed it to be easy to build and package on multiple platforms and to support a wide range of hardware architectures. While supporting all of these platforms, we also need to ensure that Velox remains fast and that regressions are caught early.
This blog post is part of a series discussing different features and optimizations of the simple function interface in Velox.
One of the queries shadowed internally at Meta was much slower in Velox than in Presto (2 CPU days vs. 4.5 CPU hours). Initial investigation identified that the overhead was related to casting empty strings inside a try_cast.
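For illustration only, here is a minimal sketch of the pattern involved (the query, column, and table names below are made up, not taken from the original post): try_cast turns the failed cast of an empty string into a NULL instead of an error, so every empty input pays the cost of an attempted cast plus error suppression.

```sql
-- Hypothetical example: the input column frequently contains empty strings,
-- and each one triggers a failed cast that try_cast converts to NULL.
SELECT try_cast(price AS DOUBLE) AS price_value  -- '' fails the cast and yields NULL
FROM orders;                                     -- 'price' and 'orders' are illustrative names

-- Equivalent scalar form:
SELECT try_cast('' AS DOUBLE);                   -- returns NULL rather than failing the query
```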
This blog post is part of a series discussing different features and optimizations of the simple function interface.