Would be interesting if they could offer it as a separate compute type that doesn’t have Spark running at all. Then you wouldn’t have that overhead on the node and could do “simple” Python and SQL things on that compute type.
I was thinking more of the All Purpose clusters, where you might have some SQL workloads but most of the work could be pure Python, so having Spark available provides little to no benefit.
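For what it's worth, a minimal sketch of the kind of "simple Python and SQL" workload I mean, using DuckDB in a plain Python process with no Spark session at all (the file path and column names are just placeholders):

    # Sketch: SQL over Parquet files without a Spark session or JVM.
    # Assumes `pip install duckdb`; the path and columns are made up.
    import duckdb

    con = duckdb.connect()  # in-process, no cluster overhead

    result = con.execute("""
        SELECT customer_id, SUM(amount) AS total_spend
        FROM read_parquet('/dbfs/tmp/orders/*.parquet')
        GROUP BY customer_id
        ORDER BY total_spend DESC
        LIMIT 10
    """).fetchdf()

    print(result)

That's the whole job for a lot of these workloads, and none of it needs Spark on the node.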
Yeah, but if they had a clicky, easy way to do it, with a Duck logo, and donated to the foundation... it could be a pretty good marketing campaign that might make people like us more biased towards them and away from Snowflake.
It's still not that quick in terms of latency, though. Loads of companies put Cube in front of Databricks to overcome this.
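For context, that pattern usually means the app queries Cube's REST API and Cube serves matching pre-aggregations instead of hitting the warehouse on every request. A rough sketch of the client side, where the deployment URL, token, and cube/measure names are all placeholders rather than a real deployment:

    # Sketch: querying a Cube deployment that sits in front of Databricks.
    # URL, token, and cube/measure names are placeholders.
    import requests

    CUBE_API = "https://example-cube-deployment/cubejs-api/v1/load"
    TOKEN = "<cube-api-token>"

    query = {
        "measures": ["Orders.totalRevenue"],
        "dimensions": ["Orders.status"],
        "timeDimensions": [{
            "dimension": "Orders.createdAt",
            "granularity": "day",
            "dateRange": "last 7 days",
        }],
    }

    resp = requests.post(
        CUBE_API,
        headers={"Authorization": TOKEN, "Content-Type": "application/json"},
        json={"query": query},
    )
    resp.raise_for_status()
    # Served from Cube's pre-aggregations when they match, so the
    # dashboard never waits on a Databricks query for hot paths.
    print(resp.json()["data"])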