zachrenwick.work
Data analytics engineer #SQL #DataModeling #Analytics #BigQuery #WebAnalytics zachrenwick.work
51 posts 125 followers 147 following

Anyone here used Looker (as a semantic/metrics layer) and Tableau together? How is the native Looker to Tableau connector?

Is there going to be a Small data meetup in Vancouver or Seattle this year? @duckdb.org @motherduck.com Please make it happen!

Another analytics engineering (AE) LinkedIn bash post. Hard not to react (it's my current job), but having an AE title doesn't mean you'll build a spaghetti data architecture. There are tons of data engineers who create perfectly optimized data models that are used by no one.

BigQuery just released UNION ALL BY NAME. It solves the problem of having to perfectly align each field's position in SQL. As long as the fields have the same name, you're good!
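For anyone curious, here's roughly what that looks like (a minimal sketch using the google-cloud-bigquery Python client; the dataset and column names are made up, and it assumes credentials are already configured):

```python
# Hypothetical example of UNION ALL BY NAME: the two SELECTs list their columns
# in different orders, and BigQuery matches them by name instead of by position.
from google.cloud import bigquery  # assumes google-cloud-bigquery is installed

client = bigquery.Client()

query = """
SELECT 'web' AS channel, page_views, sessions
FROM analytics.web_daily
UNION ALL BY NAME
SELECT sessions, page_views, 'app' AS channel
FROM analytics.app_daily
"""

for row in client.query(query).result():
    print(dict(row))
```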

Complex measure calculations in raw SQL are driving me insane. Yes, we want to transform data in the warehouse with version-controlled code, but these final-mile calculations are so much easier and more flexible to do in a BI tool. Really missing Power BI right now lol

I've never been to a data conference. In 2025 I'm making it happen (even if I have to personally pay for it, or if it's just a small data meetup in Seattle)

Perpetually flip-flopping on self-serve analytics. Some days it feels like it could be the holy grail: enabling downstream users to do their own exploration and analysis on governed metrics. Fewer ad hoc requests! Reporting automation! Other days: so what? Does that actually move the needle?

Some of you have never had to fight for BigQuery slots and it shows

Google Analytics 4 contains a known traffic source misattribution bug in the raw BigQuery export. I fixed it with a few lines of code:
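(The code from the original post wasn't captured here. As a placeholder, below is a sketch of one commonly discussed workaround for session source misattribution in the GA4 export: take the first non-null collected traffic source per session with a window function. It is not necessarily the same bug or the same fix the author is describing, and the project/dataset names are made up.)

```python
# Hypothetical sketch, not the author's fix: derive a session-level source by
# taking the first non-null collected_traffic_source value within each session.
from google.cloud import bigquery

client = bigquery.Client()

query = """
WITH events AS (
  SELECT
    user_pseudo_id,
    event_timestamp,
    (SELECT value.int_value FROM UNNEST(event_params)
     WHERE key = 'ga_session_id') AS ga_session_id,
    collected_traffic_source.manual_source AS manual_source
  FROM `my-project.analytics_123456789.events_*`  -- made-up export dataset
)
SELECT
  user_pseudo_id,
  ga_session_id,
  FIRST_VALUE(manual_source IGNORE NULLS) OVER (
    PARTITION BY user_pseudo_id, ga_session_id
    ORDER BY event_timestamp
  ) AS session_source
FROM events
"""

rows = client.query(query).result()
```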

Pretty easy to verify your Bluesky account if you already have a domain. I just did mine using the below guide... does this mean I get a blue check now??? bsky.social/about/blog/4...

It’s a damn tough market out there right now, particularly in the Vancouver tech scene. You never know where you could end up; I nearly applied for some data jobs at this company before getting my current role dailyhive.com/canada/bench...

New [mountain] bike day! This thing is a beast, looking forward to the bigger brakes and suspension for north shore riding

Standing Desk Celebrates 4th Year At Lowest Possible Setting

@zachrenwick-data.bsky.social's resume is on @evidence.dev 🤯 zachrenwick.work

Sorry, how much?!? techcrunch.com/2024/12/17/d...

Something fundamentally wrong with Strava’s Year in Sport (Spotify Wrapped, but for logging your exercise) is that December isn’t done yet and they’ve sent it out.

Am I right in thinking one of the major benefits of open table formats is being able to bring your own compute (even if it's local)? I'm envisioning a future where I can load my development BigQuery/Iceberg/Tabular tables using my local machine. Anyone doing this already?
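For what it's worth, here's a rough sketch of the local-compute idea using DuckDB's iceberg extension (purely illustrative: the table path is made up, and it assumes the extension can be installed in your environment):

```python
# Hypothetical sketch: query an Iceberg table from a laptop with DuckDB,
# no warehouse compute involved. The table path below is made up.
import duckdb

con = duckdb.connect()
con.execute("INSTALL iceberg")
con.execute("LOAD iceberg")

# Point local compute at the Iceberg table's storage location.
result = con.execute(
    "SELECT COUNT(*) FROM iceberg_scan('/data/dev_warehouse/analytics/events')"
).fetchone()
print(result)
```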

Sick of waiting for inconsistent Google Analytics 4 data to land in BigQuery? No problem, just turn on Fresh Daily export. Data has been ready by 3am every day! Oh yeah and one small detail, you have to pay >$50k for GA360 🤷‍♂️

Crazy that I started my data career as a project accountant with PowerPivot in Excel, and after 8 years I find myself going back to it for quick/scrappy analytics. PowerPivot provides the model/relationships/metrics, and I connect to BigQuery datasets with ODBC.

Thinking about open sourcing the SQL code I used to integrate/join GA4 and Universal Analytics data to create our reporting datasets. It was a major unlock for our organization to have continuous web metrics before and after the cutover. Would that be useful for any of you?
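If it helps to picture it, the general shape is something like this (illustrative only, not the code being described: the table names, columns, and cutover date are all made up):

```python
# Hypothetical sketch of stitching Universal Analytics history onto GA4 data
# to get one continuous daily metrics series. Names and dates are made up.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT date, sessions, pageviews
FROM reporting.ua_daily            -- Universal Analytics, pre-cutover
WHERE date < '2023-07-01'          -- assumed cutover date
UNION ALL
SELECT date, sessions, pageviews
FROM reporting.ga4_daily           -- GA4, post-cutover
WHERE date >= '2023-07-01'
ORDER BY date
"""

for row in client.query(query).result():
    print(row.date, row.sessions, row.pageviews)
```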

Has anyone implemented write-audit-publish patterns with Dagster? Would be great to see some examples
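In case a concrete sketch helps the discussion, here's one way the pattern could look inside a Dagster asset (not an official Dagster recipe; the BigQuery tables and the audit query are made up):

```python
# Hypothetical write-audit-publish sketch: write to a staging table, run an
# audit query, and only publish (replace prod) if the audit passes.
from dagster import Failure, asset
from google.cloud import bigquery

STAGING = "analytics.orders__staging"  # made-up table names
PROD = "analytics.orders"

@asset
def orders(context) -> None:
    client = bigquery.Client()

    # Write: load fresh data into staging only.
    client.query(
        f"CREATE OR REPLACE TABLE {STAGING} AS SELECT * FROM raw.orders"
    ).result()

    # Audit: block publishing if the staging data fails a basic check.
    audit = client.query(
        f"SELECT COUNT(*) AS null_ids FROM {STAGING} WHERE order_id IS NULL"
    ).result()
    if next(iter(audit)).null_ids > 0:
        raise Failure(f"Audit failed: null order_id rows in {STAGING}")

    # Publish: replace the production table with the audited staging data.
    client.query(
        f"CREATE OR REPLACE TABLE {PROD} AS SELECT * FROM {STAGING}"
    ).result()
    context.log.info(f"Published {PROD}")
```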

Dagster is awesome. Google Analytics 4 data landing time varies significantly every day, so we're building an automated sensor to trigger runs whenever the data lands instead of the existing cron schedule. This will save me from having to manually trigger additional runs when the data is late!
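Rough shape of what we're going for (a sketch only: the project/dataset, job name, and polling cadence are placeholders, not our actual setup):

```python
# Hypothetical sensor sketch: poll BigQuery for yesterday's GA4 export table
# and kick off the reporting job once it exists, instead of a fixed cron time.
from datetime import date, timedelta

from dagster import RunRequest, SkipReason, sensor
from google.cloud import bigquery
from google.cloud.exceptions import NotFound

@sensor(job_name="ga4_reporting_job", minimum_interval_seconds=300)
def ga4_export_landed(context):
    client = bigquery.Client()
    suffix = (date.today() - timedelta(days=1)).strftime("%Y%m%d")
    table_id = f"my-project.analytics_123456789.events_{suffix}"  # made-up ids

    try:
        client.get_table(table_id)  # raises NotFound until the daily export lands
    except NotFound:
        return SkipReason(f"{table_id} hasn't landed yet")

    # One run per daily table: the run_key de-duplicates repeat triggers.
    return RunRequest(run_key=table_id)
```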