Microsoft Business Applications Summit (MBAS) turned out to be a veritable goldmine for Power BI. The announcements are out in force, and Marc Lelijveld (Twitter | Blog) has penned an excellent summary of the features. I’d like to give my two cents on the two features I personally find the most exciting: hybrid tables and streaming datasets.

Hybrid Tables

Let’s start with hybrid tables - they’re what we’ve been wishing for ever since DirectQuery and Composite Models came out. They will give us the ability to combine imported data with DirectQuery data in a single table, seamlessly. I have a use case for it right now: an application that logs a lot of data from an integration platform.
Thanks to Kendra Little’s blog post Moving from WordPress to an Azure Static Site with Hugo, I was inspired to try the same. Since I’ve already experimented with Hugo for some time, the move to Azure Static Sites was dead simple - and I love the GitHub integration. I save my markdown file, I push to GitHub, and a few minutes later my changes are up there. Fantastic!
It’s T-SQL Tuesday! I’m trying to get back on the blogging bandwagon, and for me, that’s about as fun as pulling teeth. I have the utmost respect for the people who can blog all day and still make it look easy (I know it isn’t), but me, I just have to slog through. T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter), who sent out the first T-SQL Tuesday invitation in December 2009. It is a monthly blog party held on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts.
I’ll go straight to the point: I think live demos in technical sessions are a waste of time. No, no - hear me out, I’ll explain what I mean. Even more importantly, I’m curious to hear dissenting views. I’ll start with a little bit of background so you’ll understand where I’m coming from. I’m a Microsoft Certified Trainer, and I’ve been training people professionally for over 20 years. For me, it’s all about the penny dropping for the learner. To put it simply: if you don’t get what I’m trying to teach you, that’s on me, and I need to do better to help you understand.
It’s T-SQL Tuesday! It has been a long time since I last participated, but this month really struck a chord. T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter), who sent out the first T-SQL Tuesday invitation in December 2009. It is a monthly blog party held on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate in this monthly blog party.

The Ask

This month’s T-SQL Tuesday is hosted by James McGillivray (Blog | Twitter). James wants to know how we’re managing to give ourselves some breaks, to keep ourselves from going even more bonkers.
I just had an absolute blast presenting at the Data Platform Discovery Days - both the European and the US edition! For the US edition I presented “Azure Machine Learning for the Absolute Beginner”, a session looking at machine learning in general, walking through Azure Machine Learning and giving several examples of machine learning in action - in both expected and unexpected places! The European edition asked for “Building an Empire - Implementing Power BI Step by Step”, a session on Power BI, datasets and dataflows.
In part 1 of this blog series I outlined the gear we use to record the podcast. The second part was all about actually recording content. The third was a wall of text on post-processing. After getting back from Oslo and the Nordic Infrastructure Conference, it is now time to finish off the series with an outline of how I publish and push the podcast on the different social media platforms.

- Gear
- Recording
- Post Processing
- Publication

Base camp

After working off the blog for a bit, we started using the podcasting platform Pippa in January of 2018. (Pippa was subsequently absorbed into Acast.)
In part 1 of this blog series I outlined the gear we use to record the podcast. The second part was all about actually recording content. It’s now time to dive into the third and most technical part – post-processing.

- Gear
- Recording
- Post Processing
- Publication

As a quick recap, you might remember that the starting point for this step is the raw audio files. I will typically have one file per host, plus the recording of the Teams meeting. Let’s start with the Teams recording, as we need a way to extract the audio feed from the video so I can use it for lining up my other audio files.
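As a rough sketch of that extraction step - assuming ffmpeg is installed and on the PATH - something like the Python snippet below would pull the audio track out of a Teams MP4 recording and save it as a WAV file. The file names and the extract_audio helper are illustrative only, not the tooling actually used for the show.

```python
# Sketch: extract the audio track from a Teams recording so it can be
# used as a reference for lining up the per-host audio files.
# Assumes ffmpeg is installed and on the PATH; file names are examples.
import subprocess
from pathlib import Path


def extract_audio(video: Path, wav: Path) -> None:
    """Strip the video stream and keep the audio as 48 kHz stereo PCM WAV."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", str(video),        # input: the Teams meeting recording (e.g. an MP4)
            "-vn",                   # drop the video stream
            "-acodec", "pcm_s16le",  # uncompressed 16-bit PCM audio
            "-ar", "48000",          # 48 kHz sample rate
            "-ac", "2",              # stereo
            str(wav),                # output WAV file
        ],
        check=True,
    )


if __name__ == "__main__":
    # Hypothetical file names, just to illustrate the step.
    extract_audio(Path("teams-meeting.mp4"), Path("teams-meeting.wav"))
```

From there, the resulting WAV can serve as the reference track for lining up the individual host recordings.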