[{"content":"I wanted a clean publish pipeline for my Hugo blog hosted on a Netcup VPS. The goal was simple: push to the main branch in Azure DevOps, and the live site updates automatically. No manual builds, no FTP.\nThe VPS already runs Pangolin, which bundles Traefik as its reverse proxy. That meant I had an existing Docker network and a working cert resolver to plug into. The missing pieces were the Hugo serving container, a deploy user with SSH access, and an Azure Pipeline to tie it together.\nPrerequisites A Hugo repository hosted in Azure DevOps Repos A VPS with Pangolin already running (Traefik and the pangolin Docker network are assumed to exist) SSH root access to the VPS The Hugo theme installed as a git submodule (relevant for the pipeline checkout step) Setting up the VPS Directory structure mkdir -p /apps/hugo/public The public directory is the rsync target from the pipeline and the webroot for nginx. A placeholder index.html prevents nginx from returning a 403 before the first real deploy lands.\necho \u0026#34;\u0026lt;html\u0026gt;\u0026lt;body\u0026gt;Coming soon\u0026lt;/body\u0026gt;\u0026lt;/html\u0026gt;\u0026#34; \u0026gt; /apps/hugo/public/index.html Docker Compose Create /apps/hugo/docker-compose.yml:\nservices: hugo: image: nginx:alpine container_name: hugo restart: unless-stopped volumes: - ./public:/usr/share/nginx/html:ro networks: - pangolin networks: pangolin: external: true The container joins the pangolin Docker network so Traefik can reach it by container name. No labels are needed here. Pangolin\u0026rsquo;s Traefik does not use the Docker provider, so labels on containers are ignored entirely. Routing is configured through Traefik\u0026rsquo;s file provider instead, covered in the next section.\nStart the container:\ncd /apps/hugo docker compose up -d DNS Add an A record pointing to the VPS IP before starting the container. Let\u0026rsquo;s Encrypt needs the domain to resolve correctly for the HTTP-01 challenge. 
Add a CNAME for the www subdomain at the same time.\nhendrickxconsulting.com A 213.109.161.149 www.hendrickxconsulting.com CNAME hendrickxconsulting.com Traefik configuration Pangolin\u0026rsquo;s Traefik uses a file provider and an HTTP provider that polls Pangolin\u0026rsquo;s own API. There is no Docker provider. The routing for the hugo container needs to be added manually to /apps/pangolin/config/traefik/dynamic_config.yml.\nAdd the following to the middlewares, routers, and services sections of that file. The www router includes a redirect to the bare domain so both addresses work but canonicalise correctly.\nhttp: middlewares: # ... existing middlewares ... redirect-to-non-www: redirectRegex: regex: \u0026#34;^https://www\\\\.hendrickxconsulting\\\\.com(.*)\u0026#34; replacement: \u0026#34;https://hendrickxconsulting.com${1}\u0026#34; permanent: true routers: # ... existing routers ... hugo-router: entryPoints: - websecure middlewares: - security-headers rule: Host(`hendrickxconsulting.com`) service: hugo-service tls: certResolver: letsencrypt hugo-router-redirect: entryPoints: - web middlewares: - redirect-to-https rule: Host(`hendrickxconsulting.com`) service: hugo-service hugo-router-www: entryPoints: - websecure middlewares: - security-headers - redirect-to-non-www rule: Host(`www.hendrickxconsulting.com`) service: hugo-service tls: certResolver: letsencrypt hugo-router-www-redirect: entryPoints: - web middlewares: - redirect-to-https rule: Host(`www.hendrickxconsulting.com`) service: hugo-service services: # ... existing services ... hugo-service: loadBalancer: servers: - url: http://hugo:80 Traefik watches this file and reloads automatically. No restart needed.\nGeoblock Pangolin ships with a geoblock middleware applied globally on the websecure entrypoint in traefik_config.yml. That is fine for tunnelled homelab services but wrong for a public blog. 
The fix is to remove it from the global entrypoint and apply it explicitly only to the Pangolin-specific routers.\nIn /apps/pangolin/config/traefik/traefik_config.yml, update the websecure entrypoint:\n# Before websecure: http: middlewares: - crowdsec@file - geoblock@file # After websecure: http: middlewares: - crowdsec@file Then add geoblock@file explicitly to each Pangolin router in dynamic_config.yml that should stay Belgium-only:\nnext-router: middlewares: - security-headers - geoblock@file api-router: middlewares: - security-headers - geoblock@file ws-router: middlewares: - security-headers - geoblock@file The hugo routers get security-headers only. CrowdSec still applies globally via the entrypoint, so bot protection stays in place everywhere.\nThis change touches traefik_config.yml, the static config, so a restart is required:\nsudo docker restart traefik Deploy user The pipeline deploys over SSH as a dedicated deploy user rather than root. Create the user and set up the correct directory permissions:\nuseradd -m -s /bin/bash deploy mkdir -p /home/deploy/.ssh chmod 700 /home/deploy/.ssh chown -R deploy:deploy /home/deploy/.ssh chown -R deploy:deploy /apps/hugo/public SSH is strict about ownership. The .ssh directory and authorized_keys file must be owned by the user, not root, even if the directory was created as root. Getting this wrong is the most common reason key auth silently falls back to a password prompt.\nSSH key setup Generate a keypair on a local machine, not the VPS:\nssh-keygen -t ed25519 -C \u0026#34;hugo-deploy\u0026#34; -f ~/.ssh/hugo_deploy -N \u0026#34;\u0026#34; Install the public key on the VPS using ssh-copy-id:\nssh-copy-id -i ~/.ssh/hugo_deploy.pub deploy@213.109.161.149 Verify it works before continuing:\nssh -i ~/.ssh/hugo_deploy deploy@213.109.161.149 # should open a shell with no password prompt Also install rsync on the VPS. 
It is not always present by default:\napt-get install -y rsync Azure DevOps pipeline Secure file The private key is stored as a Secure File in Azure DevOps rather than as a plain variable. This keeps it as an actual file on the runner rather than requiring base64 gymnastics.\nGo to Pipelines → Library → Secure files Click + Secure file and upload ~/.ssh/hugo_deploy (the private key, no .pub) Click the file after upload, enable Authorize for use in all pipelines, then save Variable group In Pipelines → Library → Variable groups, create a group named hugo-deploy-vars with three variables:\nName Value DEPLOY_HOST IP-ADDRESS-OF-VPS DEPLOY_USER deploy DEPLOY_PATH /apps/hugo/public/ Pipeline YAML Create azure-pipelines.yml at the root of the Hugo repository:\ntrigger: branches: include: - main pool: vmImage: ubuntu-latest variables: - group: hugo-deploy-vars steps: - checkout: self submodules: recursive fetchDepth: 0 - task: DownloadSecureFile@1 name: sshKey displayName: Download SSH deploy key inputs: secureFile: hugo_deploy - script: | HUGO_VERSION=0.160.1 wget -q https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_linux-amd64.tar.gz tar -xzf hugo_extended_${HUGO_VERSION}_linux-amd64.tar.gz hugo sudo mv hugo /usr/local/bin/ displayName: Install Hugo - script: | hugo displayName: Build Hugo site - script: | chmod 600 $(sshKey.secureFilePath) rsync -avzr --delete \\ -e \u0026#34;ssh -i $(sshKey.secureFilePath) -o StrictHostKeyChecking=no\u0026#34; \\ public/ \\ $(DEPLOY_USER)@$(DEPLOY_HOST):$(DEPLOY_PATH) displayName: Deploy via rsync submodules: recursive is needed if the theme is a git submodule, which is the recommended way to manage PaperMod. fetchDepth: 0 is needed for Hugo\u0026rsquo;s .GitInfo and lastmod features to work correctly. 
The Hugo version is pinned explicitly rather than using latest, because theme compatibility is sensitive to version jumps and it is better to control upgrades deliberately.\nThe --delete flag on rsync ensures that posts or pages removed from the repo also disappear from the live site.\nCreating the pipeline Go to Pipelines → Pipelines → New pipeline Select Azure Repos Git, then select the Hugo repository Select Existing Azure Pipelines YAML file, set branch to main and path to /azure-pipelines.yml Click Run On the first run, Azure DevOps will pause and ask for permission to access the secure file. Click View → Permit → Permit. This is a one-time step per pipeline.\nGotchas Do not use hugo --minify. Hugo\u0026rsquo;s JSON minifier runs over JSON-LD structured data blocks and rejects invalid JSON that older versions silently ignored. PaperMod has at least one such block. The flag is not worth the debugging overhead for a blog.\nEnd result Every push to main in Azure DevOps triggers a build and an rsync deploy to /apps/hugo/public/ on the VPS. The nginx container serves whatever is on disk, no restart required, changes appear immediately. Traefik handles TLS termination, HTTP-to-HTTPS redirects, and www-to-bare-domain canonicalisation via the file provider config.\nBoth hendrickxconsulting.com and www.hendrickxconsulting.com get their own Let\u0026rsquo;s Encrypt certificates. The www variant redirects permanently to the bare domain. CrowdSec applies globally; geoblock applies only to the Pangolin tunnel routes, leaving the blog publicly accessible worldwide.\nThe full pipeline from git push to live site takes around 90 seconds on a Microsoft-hosted runner.\n","permalink":"/posts/hugo-blog-to-vps/","summary":"\u003cp\u003eI wanted a clean publish pipeline for my Hugo blog hosted on a Netcup VPS. The goal was simple: push to the \u003ccode\u003emain\u003c/code\u003e branch in Azure DevOps, and the live site updates automatically. 
No manual builds, no FTP.\u003c/p\u003e\n\u003cp\u003eThe VPS already runs Pangolin, which bundles Traefik as its reverse proxy. That meant I had an existing Docker network and a working cert resolver to plug into. The missing pieces were the Hugo serving container, a deploy user with SSH access, and an Azure Pipeline to tie it together.\u003c/p\u003e","title":"Deploying a Hugo blog from Azure DevOps to a self-hosted VPS"},{"content":"Intro The feature discussed in this blog post is experimental and subject to changes. Use at your own risk.\nMicrosoft has recently introduced a new experimental feature to directly store the modifications to your canvas apps to Git. This offers multiple advantages:\nMultiple people can now work on the same power app at the same time. All changes are stored in Git and once you save, your colleague can sync the changes and pick up from where you left off. You have version control for your app. All saves are stored as a Git commit, meaning you have a full trace of all changes and modifications. If you mess up, you can go back easily. You can link this to CI/CD pipelines to automatically build and deploy versions of your app. Using Git version control really opens up Power Platform development to mature and proven good practices in software development.\nA view on the repository for one of my Power Apps in GitHub.\nEnabling and using version control To enable the experimental feature and to get a view on how to use it, I’ll refer you to the official Microsoft documentation about it here:\nhttps://learn.microsoft.com/en-us/power-apps/maker/canvas-apps/git-version-control\nThe most important steps are:\nEnabling the feature in the Power Apps Settings Create a repository, for example on GitHub, for your canvas app Connect your app to the Git repository Generate a Personal Access Token for the repo that allows Power Apps to read/modify the repository Profit! 
The annoying part… always authenticating While this feature offers a lot of benefits, one thing that bothered me while using it was that each time you open up the canvas app, you’d be presented with this:\nSign in… again!\nMicrosoft does not store your Git credentials between sessions, which means you’ll have to authenticate each time you open the Power Apps editor and each time you’ll need to provide your username and the personal access token. This token is often a long autogenerated string that’s impossible to remember, so you might end up copy-pasting this PAT from somewhere each time you open up the app.\nBitwarden to the rescue It’s a good practice to use a password manager. They allow you to create a random password per service that you use on the internet and store all of these so you don’t have to remember them.\nYou can use a password manager to store your Git credentials as well. Any of them will do, but my personal preference goes out to Bitwarden. Bitwarden is open source, free and easy to use while not compromising on security. It has browser extensions for all main browsers and iOS and Android apps.\nOnce you’ve set up Bitwarden, installed the browser plugin and created an account, we can add the Git credentials. The easiest way to do this is to browse to your canvas app, or Dataverse for teams canvas app, that has the experimental Git feature enabled. You’ll be greeted with the sign in prompt from the screenshot above.\nHere’s the step by step:\nClick on the browser extension and add a new login by clicking on the + sign. This will create a new login. Give the login a name. I would suggest naming it after your Power app because the login is specific to this app. Fill in the username: your GitHub username Fill in the password: your Personal Access Token The URI is important as well. As you can see, it automatically stores your current URI for the new login. This URI contains the IDs of your app. 
It only prefills the correct URI if you start this process when you’re on the app designer for the correct app in your browser. Click Save to store the login Now Bitwarden will automatically detect when you’re on a page for which it has a stored login. The icon of the browser extension changes to indicate how many potential logins are available for this page. Without leaving the Power Apps editor, you can now click on the browser extension, select the login you’ve just added and it will prefill both the username and the PAT.\n","permalink":"/posts/powerapps-github-bitwarden/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cblockquote\u003e\n\u003cp\u003eThe feature discussed in this blog post is experimental and subject to changes. Use at your own risk.\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003eMicrosoft has recently introduced a new experimental feature to directly store the modifications to your canvas apps to Git. This offers multiple advantages:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eMultiple people can now work on the same power app at the same time. All changes are stored in Git and once you save, your colleague can sync the changes and pick up from where you left off.\u003c/li\u003e\n\u003cli\u003eYou have version control for your app. All saves are stored as a Git commit, meaning you have a full trace of all changes and modifications. If you mess up, you can go back easily.\u003c/li\u003e\n\u003cli\u003eYou can link this to CI/CD pipelines to automatically build and deploy versions of your app.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eUsing Git version control really opens up Power Platform development to mature and proven good practices in software development.\u003c/p\u003e","title":"Power Apps Github login using Bitwarden"},{"content":"Intro Microsoft offers a set of sample apps, built with the Power Platform, that you can install for your team in MS Teams. 
These apps serve as fully functioning apps that you can use as they are, but above all they serve as a great example of how Microsoft wants us to build Dataverse for Teams apps. They are the most production-ready apps out there, built by Microsoft itself.\nThese apps usually contain multiple canvas apps, some Power Automate flows and sometimes even a Power Virtual Agent component.\nI use them as inspiration:\nLooking for a way to support multiple languages in PowerApps: The examples got you covered. Looking for a way to split functionalities between a general user application and a management application: The examples got you covered. Looking for a way to use consistent styling and theming in your PowerApp: The examples got you covered. Looking to build a fancy loading screen: The examples got you covered. You can find the sample apps and how to install them in your Teams environment here. Microsoft currently has a collection of 10 sample apps. A couple of the most prominent are shown below. The link also describes how you can install these sample apps to try them out for yourself.\nSome of the Dataverse for Teams sample apps.\nEditing a sample app After you’ve installed a sample application, you are able to use it in Teams, or you can edit the application straight from the Power Apps app in Teams.\nWarning: these apps are complex and can be very overwhelming.\nWhen editing the app, you can modify whatever you want or extract parts of it that you want to reuse in your own Power App development. This is where things can get difficult: The Power Apps studio is not like VS Code. There’s no way to jump to, for example, the definition of a variable to inspect it and see what’s really behind it. You need to find where the variable is declared yourself. In the screenshot below, we see that the styling of the Ideas label for property FocusedBorderThickness is coming from:\ngblAppStyles.Label.FocusedBorderThickness But where is gblAppStyles defined? 
I went looking in the App.OnStart property, but no luck. Rather than browsing through all possible properties to find where it might be defined, it’s easier to find it when you unpack the solution in VS Code and can just execute a Search on the entire application.\nA screenshot of the “Manage Ideas” app loading screen in the Power Apps Studio\nFind the MsApp file So I wanted to download the Employee Ideas solution and unpack it to open it in VS Code. I’ve read before that Dataverse for Teams apps do exist in their own solution, but for some weird Microsoft logic, they’re only visible in the Solutions section of the Power Automate Portal. Once you log in to https://flow.microsoft.com you can see your Teams under Environments in the upper right corner. Select your Team and go to Solutions. Bingo! The Employee Ideas solution is there. However, the sample apps are installed as Managed Solution, and we’re not able to export these. Only unmanaged solutions can be exported.\nUnable to export the managed solution from the Power Automate Portal\nSo we’re looking for another way to get the solution exported… Luckily, Microsoft is using GitHub for most of its documentation and resources. So after some searching, I stumbled upon the OfficeDev GitHub account. If you look there for the name of the Sample App, you’ll quickly find a GitHub repo for it. Some examples below:\nSample App Repo Inspections https://github.com/OfficeDev/microsoft-teams-apps-inspection/releases Bulletins https://github.com/OfficeDev/microsoft-teams-apps-bulletins/releases Employee Ideas https://github.com/OfficeDev/microsoft-teams-apps-employeeideas/releases On the releases page of the repo, you’ll find the DataverseSolution. This is the one you’ll need to download.\nReleases page of the EmployeeIdeas GitHub Repo\nOnce you download and extract it, you’ll find the *.msapp files. There can be multiple if the Sample App contains for example both the main app and a manager app. 
Now let’s see how we can find what’s inside.\nThe *.msapp files\nUnpack using VS Code VS Code is a very lightweight and extensible code editor. Microsoft has created an extension called Power Platform Tools that contains a CLI for managing Power Platform assets. You can install it through the extension menu of VS Code.\nVS Code Power Platform Tools extension\nOnce installed, you can use the terminal in VS Code to unpack a *.msapp file. Put the file in a folder and browse to that folder in the terminal. Use the following command to unpack the file, with the name of the msapp file and a link to a directory where the unpacked version needs to be stored:\npac canvas unpack --msapp **NAME_OF_THE_MSAPP** --sources **DIRECTORY_TO_STORE_SOURCE** The result should look something like the below screenshot. You can open the directory in which you’ve unpacked the msapp file in VS Code.\nUnpack results\nConclusion Once you’ve opened the directory in VS Code, you’ll see what’s really behind a canvas application. It’s a collection of JSON and YAML files that store the entirety of the configuration of your Power App. Under the Src folder, you can find the definition of the screens.\nThe Loading Screen YAML file\nNow, you can easily use the VS Code search functionality to look for any variables that you weren’t able to find in the Power Apps Studio. In our example, let’s look for the gblAppStyles variable. We can see that the variable is defined in the OnVisible property on the Loading Screen.\ngblAppStyles is defined in the OnVisible property on Loading Screen.fx.yaml\nBeing able to unpack a solution like the Sample Apps offers a tool to learn from these well-built and professional Power Apps. 
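The same search works from the terminal once the sources are on disk. A minimal sketch, using a fabricated one-file stand-in for an unpacked Src folder (real unpacked apps have one .fx.yaml per screen, and the property contents shown are illustrative):

```shell
# Recreate a tiny unpacked Src/ folder with one illustrative screen
# file, then grep for the file where gblAppStyles is assigned --
# the terminal equivalent of the VS Code search described above.
mkdir -p unpacked/Src
printf 'OnVisible: |\n  Set(gblAppStyles, {Label: {FocusedBorderThickness: 4}})\n' \
  > "unpacked/Src/Loading Screen.fx.yaml"
grep -rln "gblAppStyles" unpacked/Src
```

`grep -rln` prints only matching file names, which is enough to know which screen's properties to open.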
It makes it easier to find your way through the setup and understand how these apps are built.\nIf you know a better way to achieve the same result, let me know!\n","permalink":"/posts/dataverse-for-teams-vscode/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cp\u003eMicrosoft offers a set of sample apps, built with the Power Platform, that you can install for your team in MS Teams. These apps serve as fully functioning apps that you can use as they are, but above all they serve as a great example of how Microsoft wants us to build Dataverse for Teams apps. They are the most production-ready apps out there, built by Microsoft itself.\u003c/p\u003e","title":"Easily open sample Dataverse for Teams apps in VS Code"},{"content":"Intro So you decided to try to lose some excess weight. Good for you! You browse around on the internet and find thousands of studies and blogposts and advice on how to do so. But losing weight is a personal journey in which you need to find a balance between diet and not losing your mind.\nI’ve drafted some rules that worked for me. Again, it’s a personal thing, but to me these rules made a diet manageable, measurable and effective. I used these rules to lose 15 Kg over the course of 4 months. And they helped me get rid of weight on at least 3 occasions. (We’ll talk about why it always came back at the end…) It’s a set of tips and tricks, do with it what you want, and good luck!\nThe key science on which all of these rules build is simple: If you’re burning more calories than you are taking in, you lose weight. That’s it. All of these rules are set up to decrease the calories you take in, so that you automatically burn more than you take in.\nThe rules Let’s start off with 3 key rules:\nKey rule 1: Track your calorie intake. I’ve used MyFitnessPal for this. It has a huge database of foods and I was able to find nearly everything I ate in there. 
You can set a goal (mine was a calorie intake of less than 1500, which is very strict) to keep you from overeating but the main benefit you get from tracking your intake is that you create a sense of which foods are worth it to eat, and which ones aren’t. If you eat a chocolate waffle of 500 kCal, you’ve snacked your way through a third of your daily allowed intake. Is that really worth it, or do you decide to go for an apple instead? Getting a view on the food you eat is essential to get an understanding of what you can cut out of your diet. Depending on how fast you want to lose weight, you should set a different daily goal of calorie intake. 1500 kCal is an ambitious number to stay below, so see for yourself what works. Key rule 2: Weigh yourself daily in the morning. Weight fluctuates. Sometimes you’ve splurged with a big BBQ or your body is retaining more water for a certain reason. If you only weigh yourself once a week, and that happens to be on a morning when you’ve just had a bad day yesterday, your measurement of the full week is looking worse than reality, which is a disaster for your motivation, because it looks like a full week of dieting didn’t amount to anything! By weighing yourself every morning, you get a lot of ups and downs, but at the same time you’re getting more data points. It doesn’t matter if a single data point is higher than the day before, as long as the trend is going down. So every morning, first thing when you wake up, get on that scale and track it! Key rule 3: Drink a lot of water. Especially before dinner, if dinner is the heaviest course of the day for you. I drink 2 glasses of water before dinner. It makes me feel full faster, and prevents me from overeating and going for an additional, unnecessary second or third plate. These 3 rules work together and form the basis of your motivation: You track your weight daily, you figure out what’s worth eating and what isn’t and you stop overeating by drinking more water. 
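For perspective on what that daily goal buys you: losing 15 Kg in roughly 4 months implies a sustained daily deficit. The arithmetic below uses the common rule-of-thumb of about 7700 kCal per kilogram of body fat, which is an approximation and not a figure from the rules themselves:

```shell
# Rough daily calorie deficit implied by 15 kg lost in ~120 days,
# assuming ~7700 kCal per kg of body fat (rule-of-thumb estimate).
kg_lost=15; days=120; kcal_per_kg=7700
echo $(( kg_lost * kcal_per_kg / days ))   # prints 962
```

Roughly 960 kCal per day below maintenance, which is why a 1500 kCal target is as strict as it sounds.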
Every day you step on the scale, and the trend goes down. This motivates you to keep on going, one day at a time.\nThe next rules are more concrete changes in diet:\nRule 4: Exercise and track it. I like to run. I track all my running activities with Strava and have this connected to MyFitnessPal. So on each day that I go for a run, the calories burned are added to my allowed calorie intake of the day. This creates a margin to treat yourself, or have a cheat day. You’ve compensated the cheat day by running or working out, so you’ll feel less impact on the scale the next day. Rule 5: No calories from drinks. I mainly drink water, coffee (with no sugar), or diet soda. I found that calories from sugary drinks are just adding to the calorie counter without having any benefit. There are a lot of diet sodas, so you can switch to those, or even better, to water. Rule 6: Small lunches and breakfast: When you limit yourself to only 1500 kCal per day, you can’t have a 600 kCal breakfast and lunch. I never got into the “breakfast is the most important meal of the day” mindset, so I eat a yoghurt in the morning, and a couple of slices of bread with lean meat over lunch. This gives me some more room for a bigger dinner. Rule 7: No beer. Beer, and especially the heavier ones, contain a lot of calories. If you do need a glass of alcohol after a long day, have a glass of wine, or make yourself a gin and tonic with diet tonic. Rule 8: Ketchup instead of mayonnaise. The rule says it all. Mayonnaise, or a lot of other sauces, contain a lot more fat and calories than ketchup. No sauce is even better, but if you need something, ketchup is a good alternative. Rule 9: “Healthy” late night snacks. I’m addicted to crisps. Nearly every evening while watching television. These are empty calories that don’t make you feel full. I replaced this with low calorie snacks like popcorn or “water ice”. If you do need the crisps, buy small packs instead of starting from a large bag. 
This allows you to do some portion control. Rule 10: No snacks during the day. Try to stick to your 3 meals a day, and maybe a light snack in the evening. These will already fill up your daily calorie goals, so you really can’t afford sugary snacks during the day. If you need something to keep you going, go for some fruit. Why did it come back? I know you’re thinking: “Well great, but the weight always came back because you had to diet already 3 times!”, and you’re right. So far I’ve consistently failed to keep the weight off. Looking back at the rules, it’s easy to see what went wrong, because I started falling back into old habits after I reached my target weight.\nI stopped measuring calories, but that’s ok. The process taught me what food is worth eating and what isn’t. The main issue is that I stopped weighing myself. You lose track of small changes in weight, and before you know it, next time you step onto the scale, it’s a couple of Kg more. So my ambition is to keep weighing myself consistently. You’ll notice smaller changes and can reflect on where these come from. I also stopped drinking water, started eating crisps and mayonnaise again and fell into all of my old habits.\nBut I truly believe that if I keep on weighing myself, I can keep the weight down and will be more mindful of my bad habits.\n","permalink":"/posts/10-rules-for-weight-loss/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cp\u003eSo you decided to try to lose some excess weight. Good for you! You browse around on the internet and find thousands of studies and blogposts and advice on how to do so. But losing weight is a personal journey in which you need to find a balance between diet and not losing your mind.\u003c/p\u003e\n\u003cp\u003eI’ve drafted some rules that worked for me. Again, it’s a personal thing, but to me these rules made a diet manageable, measurable and effective. I used these rules to lose 15 Kg over the course of 4 months. 
And they helped me get rid of weight on at least 3 occasions. (We’ll talk about why it always came back at the end…) It’s a set of tips and tricks, do with it what you want, and good luck!\u003c/p\u003e","title":"10 Rules for Weight Loss"},{"content":"Intro This is a follow-up post on Strava: Analyse your fitness using Power BI . Go read that post to understand the basics of the PowerBI connector for Strava and how to use it.\nHaving a group of people to run together or to track your running activities together is a great motivator to get outside and exercise. Strava knows this and supports this by offering clubs. You can create a club for your running squad and get a view on your friends’ activities. It gamifies the running experience, with a leaderboard of who ran the most in the last week. You can even use it to plan runs together.\nThe dashboard on Strava for clubs is limited. It only shows you the graph of everyone’s current week progress, and a leaderboard for the previous week. You can see an example of this below.\nStrava Club Overview\nThis post digs deeper into the Strava PowerBI data connector to see what data we can fetch from the Strava API to get more insight into our club activities.\nStrava PowerBI data connector The PowerBI connector by Kasper on BI is limited to 2 Strava API endpoints: Athlete and Athlete Activities. However, the Strava API allows us to connect to our club as well.\nI’ve modified the connector code to include the following 2 endpoints:\n/clubs/{id}: Returns a given club using its identifier. /clubs/{id}/activities: Retrieve recent activities from members of a specific club. The emphasis is on recent. To respect the privacy of your club members, the information that you can get is limited. The modified connector can be found on my GitHub. 
See my previous article for instructions on how to build and use the connector.\nFinding and setting the club ID To fetch the information of your club, you’ll need to set the club ID in the same way as we’ve set the client_id and client_secret. Currently the connector only works with one club at a time.\nTo find the club ID, browse to the club page on Strava and look at the URL. On the landing page of the club, the URL will probably still contain the club name instead of the ID, but if you click on, for instance, recent activities, the URL will change to include the club number as shown below:\nThe Strava club ID in the URL\nAdd a file to the connector project called club_id and paste in the club id. Make sure to not include any spaces.\nThe added club_id file\nThe club_id will be fetched by the connector using the code below:\nclub_id = Text.FromBinary(Extension.Contents(\u0026#34;club_id\u0026#34;)); I won’t go into the other code changes. They are basically a copy/paste with some modifications.\nData in PowerBI When you build the code and update the connector in PowerBI, you’ll be able to get more data:\nThe new Club and ClubActivities tables\nA view on the data in the tables:\nFields in the Club Tables\nClub: For the club itself, we retrieve a lot of metadata, ranging from the type, cover photo, number of members, name, profile picture, type of activities,… You can use this to provide some background on the club in your dashboards ClubActivities: The activities of your club members are limited to the last month. In addition, the API operation returns an object SummaryActivity which doesn’t contain the full details of the activity. We’re limited to the name of the athlete and the activity, along with the distance, moving time, elapsed time, total elevation gain and type of activity. We don’t even have the start date and time of the activity. 
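The dashboard measures built on this data boil down to simple aggregations. The "sum of all distances per member" measure, for instance, is a group-by-and-sum, sketched here outside PowerBI over made-up sample rows standing in for the ClubActivities table (names and numbers are illustrative):

```shell
# Total distance (in metres) per member from sample activity rows,
# sorted descending -- the club leaderboard measure in miniature.
cat > activities.csv <<'EOF'
An,5000
An,7000
Jo,10000
EOF
awk -F',' '{sum[$1]+=$2} END {for (a in sum) print a, sum[a]}' activities.csv \
  | sort -k2 -rn
```

The average-speed measure works the same way, dividing summed distance by summed moving time per member.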
Overview of my activities in the club\nSome insights Given its limitations, there’s not a whole lot of insight we can gather from this data. I did build a small dashboard showing:\nThe club profile picture The club name Number of members Over the last month, the sum of all distances per member: Who ran the furthest? Over the last month, the average speed per member: Who ran the fastest? You can play around with the elevation and moving time, but without a start date and time, that’s about it.\nThe Club dashboard\nConclusion It was a fun exercise to extend the Strava PowerBI connector, but unfortunately the API to get information about your club members is fairly limited. This is understandable because you’re fetching other people’s data.\nTo get a more in-depth insight into your club performance, we’d have to build an app that allows all club members to authenticate and grant permission to use their activities. That way, we’d have the full data set of all members and could go a lot deeper into the analysis.\nBut that’s a topic for another post!\n","permalink":"/posts/strava-club-powerbi/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cblockquote\u003e\n\u003cp\u003eThis is a follow-up post on \u003ca href=\"/posts/strava-powerbi\" title=\"Strava: Analyse your fitness using Power BI \"\u003eStrava: Analyse your fitness using Power BI \u003c/a\u003e. Go read that post to understand the basics of the PowerBI connector for Strava and how to use it.\u003c/p\u003e\n\u003c/blockquote\u003e\n\u003cp\u003eHaving a group of people to run together or to track your running activities together is a great motivator to get outside and exercise. Strava knows this and supports this by offering clubs. You can create a club for your running squad and get a view on your friends’ activities. It gamifies the running experience, with a leaderboard of who ran the most in the last week. 
You can even use it to plan runs together.\u003c/p\u003e","title":"Strava: Analyse your Strava Club in PowerBI"},{"content":"Intro Power Platform is an amazing set of low-code products. Although most people with a work account and a Microsoft 365 license can already make use of a lot of its functionalities, it can be difficult to find a clean environment to learn and experiment with the products. In many organizations, the IT department, rightfully, limits most of these tools.\nSo what can you do if you do want to experiment, but your company doesn’t allow it or you don’t have a work account? Luckily Microsoft has got you covered.\nIn this blog, I’ll explain how you can get (nearly) full access to a Microsoft 365 tenant and power platform, using the Microsoft developer programs.\nMicrosoft 365 Developer program First of all, you need to enroll in the Microsoft 365 Developer program. This program grants you lifetime access to a development Microsoft 365 E5 license, as long as you don’t abuse it (read: use it for production purposes) and as long as you stay active on it.\nThe program is very generous in what it provides, including:\n25 user licenses Teams Sample data When you enroll, you create a new tenant specifically for you. You can see this as a whole new Microsoft 365 work environment, just as you would create when you buy a license for yourself or your company. The steps to follow are:\nClick on “Join Now” on the Microsoft 365 Developer program website. Login with your Microsoft account. This doesn’t need to be a work or school account, so any Microsoft account will do. Select your country, company and language preference. You’ll be asked some questions about how you intend to use it and what apps you’re most interested in. It doesn’t matter what you select here. Then you can choose instant sandbox or configurable sandbox. If you’re just getting started, the instant sandbox will be preconfigured with users and teams, so you’ll be able to start fast. 
Set up your sandbox admin account. This is an important step! Remember your admin username and password, because you’ll need it later. You’ll need to verify your phone number to prove you’re a real human. After that you’re good to go and your developer subscription will be set up. Once the setup is done, you’ll arrive at the dashboard. The most important information is shown in the box below:\nIf you sign in to office.com with your administrator account and browse to all apps, you’ll see a huge number of apps at your disposal!\nThe created users and admin account all have access to Teams, SharePoint and a lot of other M365 goodies, including Power Apps! However, to my understanding, the M365 developer plan will not allow full Dataverse or premium connector access. On top of that, you don’t have your own environment, only the default environment.\nTo fix this, we need to go one step further.\nPower Apps Developer Plan The Power Apps Developer Plan is a new version of the previous community plan. This requires a work or school account. So what if you don’t have this? Luckily we just created a work tenant in the step before!\nThe plan grants you access to:\nPower Apps including premium connectors Power Automate Dataverse Just like the Microsoft 365 plan, this one does not expire as long as you actively keep using it and don’t abuse it.\nSome limitations of the plan:\nIt cannot be used for production purposes; it is strictly for building, testing and learning. 750 flow runs per month 3GB maximum database size No AI builder rights, except for a limited time trial No Dynamics 365 apps are available To enroll for the developer plan:\nClick on the “Get started free” button You need to enter a work or school email address. Fill in the admin account of our M365 developer plan. Run through the wizard and you’ll be automatically redirected to the Power Apps website. 
The end result – Your FREE learning environment It can take some time to provision the new environment, but after a while, if you browse to the Power Platform admin center, you’ll see 2 environments: the default one, created by the M365 plan for all users in your tenant, and a developer one, created specifically for your user.\nIf you browse to the Power Apps website and select the Developer environment in the top right corner, you’ll see that this one has a fully deployed Dataverse. On top of that, it grants you some extra perks that you wouldn’t have with just the E5 license.\nAnd that’s it! You can now get started with Microsoft 365 and Power Platform! Let me know if this setup works for you.\n","permalink":"/posts/power-platform-dev-environment/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cp\u003ePower Platform is an amazing set of low-code products. Although most people with a work account and a Microsoft 365 license can already make use of a lot of its functionalities, it can be difficult to find a clean environment to learn and experiment with the products. In many organizations, the IT department, rightfully, limits most of these tools.\u003c/p\u003e\n\u003cp\u003eSo what can you do if you do want to experiment, but your company doesn’t allow it or you don’t have a work account? Luckily Microsoft has got you covered.\u003c/p\u003e","title":"Get a FREE Power Platform development environment"},{"content":"Intro I have a love/hate relationship with running. I’ve always enjoyed it because it’s an activity that you can do whenever you want, for however long you want, with minimal equipment or transportation. You just step out the door, and start running. And you burn a lot of calories in the process!\nThe hate part originates from the toll it takes on your body. A bad running form can cause injuries, and my past running experience has taught me that my knees are my weak point. 
Over the last couple of years, I have started running, got too excited and didn’t give my body enough time to steadily increase the mileage and get used to it. After a couple of months, my running habit went back to zero due to injuries.\nThis is why it’s important to measure your running activities. If you’re too stubborn to listen to your body, or if you haven’t been trained on how to do so, measuring your activity can provide you insight into the strain you’re building up, and how you’re progressing towards your running goals.\nI’m tracking everything on Strava. It’s a beautiful service that is able to track all your fitness activities from multiple devices, and has a nice social network component to it as well. If you’re a premium member, you get access to more insights on your data and performance, but you’re limited to the insights that Strava offers.\nAs an example, the Strava website provides the overview below on weekly distance for the last year. You can select other years or select monthly distance, time or elevation gain, but you can’t select a custom period of your choosing.\nFor a true data-driven IT guy, this is not enough. So I went searching for a way to get my Strava data in Power BI, so that I can create my own insights, metrics and dashboards. Luckily, someone already provided a way to do this.\nStrava Power BI data connector When searching around for existing ways to import Strava activity data in Power BI, there is one article and implementation that consistently kept popping up: Building the Power BI strava data connector by Kasper on BI. I’ll link to that article and GitHub repo:\nArticle GitHub repository The article dates back to 2017 and does not seem to be maintained anymore, but its content is still valuable. I won’t rehash the entire instructions that are explained in the article to get the code, build it, connect it to your Strava Application and its keys. 
I’ll just highlight a couple of findings I made when trying to fetch my Strava activity data:\nThe project and the required Visual Studio Power Query SDK are not compatible with Visual Studio 2022, but you can use Visual Studio 2019 Community Edition. You need to pay attention to the client_id and client_secret files to not include any spaces in the file when you copy in the keys. If you build the application with spaces, you’ll encounter the following exception when connecting to the connector in Power BI: Microsoft.Mashup.Engine.Interface.ResourceAccessAuthorizationException Power BI versions before the October 2021 release use the Microsoft Internet Explorer 11 embedded control browser for OAuth authentication, which is unsupported by Strava. Therefore, make sure that you have a more recent version of Power BI: October 2021 or later. I did notice that trying to authenticate to Strava using a Google account didn’t work, because Google is still complaining about browser incompatibility, but a regular username/password login does work. The Power BI connector connects to the Strava API. This API is documented on the Strava developer site. It calls 2 endpoints:\n/athlete: Fetches the athlete data of the currently logged in user. It contains the name, city, country, sex information of the user, together with their photo and even the type of shoes. It’s general information that can be used to describe the context of the currently logged in user. /athlete/activities: This is the interesting one. It fetches the full list of activities of the user. In my case, this went back all the way to 2011 and pulled in 455 activities! For each activity, you get the type (run, bike, etc.), the distance, the elapsed time, the start time and date and much more. Based on this information you can generate a lot of insights about your activities. So this is a great starting point. 
You have the raw data available in Power BI and can start your analysis.\nSome insights Over the year 2021, I trained to run a marathon. Due to my weak knees, I needed to steadily build up mileage. A good practice is the 10% rule: every week, you increase your weekly distance by 10%. I tried to adhere to this rule, but let’s see what the data says.\nLet’s just say I started off too ambitious! After 8 weeks, I was running 40 kilometers in a week. After that, it fluctuated wildly between a high of 61 km and a low of 8.2 km. And the data shows that I overdid it. In the last quarter of the chart, the average dropped due to injury. I was still training for a marathon so I overdid it again in week 40, but had to dial it back again the weeks after. So was there really no structure in my training plan? There was…\nIt seems I did not steadily build up the weekly distance, but I did nicely build up my long runs each week. The peaks in the chart show the weekly long run. It increases linearly from 5km to 42km.\nAs a last insight, I wanted to see if there is a correlation between the start time in a day and the distance of the activity. I’m mostly a morning runner, and started my weekend long runs very early. Let’s see if the data shows this:\nThe average distance of all activities was 9.62 km. The activities in the morning before 7 AM are well above that, with nearly all activities of 20 km or more starting before 7 AM. The highest peak in the chart, at 10 AM, was the marathon race itself. In the evening, most activities were around 10 km, and the same goes for runs during the day, which was usually on a workday.\nConclusion The connector provided by Kasper on BI really enabled me to dig deeper into my activity data. It’s not a production-ready component, but it does a perfect job of covering the basics. It allowed me to go much further than Strava itself provides, in terms of activity analysis.\nThis is just the discovery phase. 
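To make the 10% rule from this section concrete, here is a tiny sketch of what a compliant weekly buildup would look like (the 10 km starting point is an arbitrary assumption for illustration, not my actual plan):

```python
# Hypothetical 10%-rule progression: each week's distance target is
# 10% higher than the previous week's target.
def ten_percent_plan(start_km, weeks):
    return [round(start_km * 1.1 ** i, 1) for i in range(weeks)]

plan = ten_percent_plan(10.0, 8)
# Starting from 10 km/week, the week-8 target is still under 20 km/week,
# which shows how far ahead of the rule an actual 40 km week is.
```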
I’m experimenting with the connector, with the Strava API and with Power BI to see what is possible. Some ideas that’ll be coming up in the near future:\nThe connector does not cover the entire scope of the Strava API and is only focused on activities. The API itself also offers information about clubs, segments and routes. I want to build on the work of Kasper on BI and extend the connector to cover more information. The activities currently do not contain map information. I’d like to see if it’s possible to use Power BI to visualize a route. ","permalink":"/posts/strava-powerbi/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cp\u003eI have a love/hate relationship with running. I’ve always enjoyed it because it’s an activity that you can do whenever you want, for however long you want, with minimal equipment or transportation. You just step out the door, and start running. And you burn a lot of calories in the process!\u003c/p\u003e\n\u003cp\u003eThe hate part originates from the toll it takes on your body. A bad running form can cause injuries, and my past running experience has taught me that my knees are my weak point. Over the last couple of years, I have started running, got too excited and didn’t give my body enough time to steadily increase the mileage and get used to it. After a couple of months, my running habit went back to zero due to injuries.\u003c/p\u003e","title":"Strava: Analyse your fitness using Power BI"},{"content":"When trying to set up my machine to run Azure Functions locally using Visual Studio Code, I ran across an error. After generating an empty HTTP-triggered Azure Function and running it, a big red error popped up in the terminal:\nfunc : File C:\\Users\\Hendr\\AppData\\Roaming\\npm\\func.ps1 cannot be loaded because running scripts is disabled on this system. For more information, see about_Execution_Policies at https:/go.microsoft.com/fwlink/?LinkID=135170. 
I happened to try the same thing on a different machine, and got exactly the same behavior. It seems that this is something that you need to resolve yourself on most Windows machines. Browsing to the URL provided in the error message, we get an explanation of what an Execution Policy is exactly:\nPowerShell’s execution policy is a safety feature that controls the conditions under which PowerShell loads configuration files and runs scripts. This feature helps prevent the execution of malicious scripts. On a Windows computer you can set an execution policy for the local computer, for the current user, or for a particular session. You can also use a Group Policy setting to set execution policies for computers and users. https:/go.microsoft.com/fwlink/?LinkID=135170\nSo it seems to be a good thing that it’s blocking scripts by default, so that no malicious scripts are executed without your permission. Of course we need to change this because we do want Visual Studio Code to be able to execute scripts.\nYou can check the Execution Policy on your machine by opening PowerShell in admin mode, and running Get-ExecutionPolicy. On my machine, this returned the value Restricted. When getting the list of all policies, we see that all of them are undefined. When this is the case, the effective policy on a Windows client machine is Restricted. Restricted means that no PowerShell scripts are allowed to run, but individual commands are allowed.\nSo what value do we need to set in order not to undermine the security benefit that the policy has? Reading the documentation by Microsoft, the RemoteSigned option seems to be a good fit:\nScripts can run. Requires a digital signature from a trusted publisher on scripts and configuration files that are downloaded from the internet which includes email and instant messaging programs. Doesn’t require digital signatures on scripts that are written on the local computer and not downloaded from the internet. 
Runs scripts that are downloaded from the internet and not signed, if the scripts are unblocked, such as by using the Unblock-File cmdlet. Risks running unsigned scripts from sources other than the internet and signed scripts that could be malicious. This guarantees at least that scripts downloaded from the internet are signed, or that they are manually unblocked. So let’s see if with this policy we can run our Azure Function. To change the policy, run the following command in PowerShell as admin:\nSet-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser This will warn you about the possible implications, which you can accept. Go ahead and confirm the policy change.\nSince we’re changing the policy for the current user of the machine, the change is persisted in the Windows Registry. When we try to run the Azure Function again after these modifications, it runs like a charm!\n","permalink":"/posts/windows-execution-policy/","summary":"\u003cp\u003eWhen trying to set up my machine to run Azure Functions locally using Visual Studio Code, I ran across an error. After generating an empty HTTP-triggered Azure Function and running it, a big red error popped up in the terminal:\u003c/p\u003e\n\u003cpre tabindex=\"0\"\u003e\u003ccode\u003efunc : File C:\\Users\\Hendr\\AppData\\Roaming\\npm\\func.ps1 cannot be loaded\nbecause running scripts is disabled on this system. For more information, \nsee about_Execution_Policies at https:/go.microsoft.com/fwlink/?LinkID=135170.\n\u003c/code\u003e\u003c/pre\u003e\u003cp\u003eI happened to try the same thing on a different machine, and got exactly the same behavior. It seems that this is something that you need to resolve yourself on most Windows machines. 
Browsing to the URL provided in the error message, we get an explanation of what an Execution Policy is exactly:\u003c/p\u003e","title":"Modifying the Windows Execution Policy on your machine"},{"content":"Intro A big part of my role at Amaris is to explain to our (potential) customers what the value of modern solutions and services can be and how they can get started with them. In an effort to do this for chatbots and virtual assistants, I did a live webinar back in June 2020 to all of the interested customers and employees of Amaris.\nThe webinar starts off with an explanation of what chatbots are and what the potential value for your company might be. After that we dive into the tools and platforms that Microsoft has provided us to easily build a chatbot. Of course, we did a live demo showing all of these tools in action and built a COVID19 (original, right?) chatbot that can answer FAQs and can look up the number of confirmed cases in a country.\nThe demo focuses on the following technologies:\nMicrosoft Bot Framework: The glue to bring everything together Bot Framework Composer: A graphical user interface to build conversation flows LUIS: The Natural Language Processing (NLP) engine of Microsoft to understand what our end user is saying QnA Maker: A Microsoft Cognitive Service that is able to jumpstart your bot building journey by automatically converting existing FAQs to intents for your chatbot. 
Happy viewing!\nWebinar Useful links Amaris: https://www.amaris.com/ Microsoft Bot Framework: https://dev.botframework.com/ Microsoft LUIS: https://www.luis.ai/ Microsoft QnA Maker: https://www.qnamaker.ai/ CDC Covid19 FAQ: https://www.cdc.gov/coronavirus/2019-ncov/faq.html Covid19 API: https://covid19api.com/ ","permalink":"/posts/chatbot-webinar/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cp\u003eA big part of my role at Amaris is to explain to our (potential) customers what the value of modern solutions and services can be and how they can get started with them. In an effort to do this for chatbots and virtual assistants, I did a live webinar back in June 2020 to all of the interested customers and employees of Amaris.\u003c/p\u003e\n\u003cp\u003eThe webinar starts off with an explanation of what chatbots are and what the potential value for your company might be. After that we dive into the tools and platforms that Microsoft has provided us to easily build a chatbot. Of course, we did a live demo showing all of these tools in action and built a COVID19 (original, right?) chatbot that can answer FAQs and can look up the number of confirmed cases in a country.\u003c/p\u003e","title":"Webinar – Get your chatbot talking to your customers with Microsoft Azure"},{"content":"Intro I have been a subscriber to Office 365 from the moment I started my own company. When I set everything up back then, I of course needed a fitting domain name. This was the domain you’re currently visiting: HendrickxConsulting.com. Microsoft offered to purchase a domain name directly from the Office 365 portal, so with an eye on limiting the number of services I needed to keep track of, this seemed like a good option.\nFast-forward to me setting up my old version of this blog. I decided to go for BlueHost.com, since they are one of the leading and widely recommended providers of hosting for a WordPress blog, and very reasonably priced. 
When you create a BlueHost account, you can specify whether you want to buy a domain from them or whether you bring your own. I chose to bring my own and filled in this domain name.\nThis means that you won’t be able to manage your domain through BlueHost, since it’s not registered at their side. They recognise the domain, but have no view on its configuration. As a result of this, https://hendrickxconsulting.com was not redirecting to this WordPress site, and the blog was only accessible on a temporary, ugly URL.\nGoogling around on how to properly configure my domain to point to WordPress, I didn’t find any instructions on how to do this. So I needed to figure it out myself and thought that would make for a good short first blogpost.\nConfiguring the Office 365 domain First of all, you shouldn’t change anything on the side of BlueHost. Since they’re not controlling the domain, it’s irrelevant what you see on the Bluehost Domain Configuration screen. Just make sure you check that the domain name is properly listed there, as you can see in the image below. BlueHost Domain Overview shows our domain with unknown details.\nThe only thing you will need to get from your Bluehost configuration portal is the shared IP address of your WordPress site. You can find this when you log in to BlueHost, and navigate to the Advanced section under General Information. The shared IP 162.241.217.240 is what we’re looking for. So remember this value and head over to your Office 365 Admin site. You can find your Domain Settings under Settings -\u0026gt; Domains. You should be able to see a list of your domains, which in my case is Hendrickxconsulting.com and the onmicrosoft.com domain that Microsoft creates by default for every new Office 365 instance. Once you click on the domain that you want to link to BlueHost, navigate to the DNS Records tab. In there, we need to add 2 records:\nAn A record or Address record: These types of records associate a domain with an IP address. 
A CNAME record: These are Alias records. They can be used to redirect one domain name to another. We’ll use a CNAME record to redirect the subdomain www.hendrickxconsulting.com to hendrickxconsulting.com. Use the Add record button on the DNS Records tab to add those 2 records using the following values:\nFor the A record, the name should be @. This refers to the domain name without any prefixes, so just HendrickxConsulting.com. The value is set to the IP address which we got from the BlueHost configuration screen.\nTo make sure that people can surf to this blog using the traditional www prefix, we added the CNAME record. The name was set to www and the value to hendrickxconsulting.com, so that everyone who browses to www.hendrickxconsulting.com is pointed to hendrickxconsulting.com.\nThe TTL or time-to-live value is the amount of time that a nameserver keeps the value cached before it will look for updates. Leaving this at the default of 1 hour is fine.\nThe last step… patience So that’s all that needs to get configured in order to make BlueHost work nicely with your Office 365 domain. The last thing you need to do is wait. It can take some time before DNS changes are reflected properly because all nameservers need to get updated. In addition, BlueHost configures an SSL certificate for your site as well, and although this happens automatically, you’ll need to be patient for this to complete after modifying the DNS settings.\nAll in all I would tell you to sleep on it. Give it a day to work its IT magic and after that you should be good to go!\n","permalink":"/posts/bluehost-0365-domainname/","summary":"\u003ch1 id=\"intro\"\u003eIntro\u003c/h1\u003e\n\u003cp\u003eI have been a subscriber to Office 365 from the moment I started my own company. When I set everything up back then, I of course needed a fitting domain name. This was the domain you’re currently visiting: HendrickxConsulting.com. 
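For reference, the A and CNAME records described in the Bluehost post above would look like this in classic zone-file notation (illustrative only; Office 365 manages them through the DNS Records UI, and the 1-hour TTL is written as 3600 seconds):

```
hendrickxconsulting.com.      3600  IN  A      162.241.217.240
www.hendrickxconsulting.com.  3600  IN  CNAME  hendrickxconsulting.com.
```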
Microsoft offered to purchase a domain name directly from the Office 365 portal, so with an eye on limiting the number of services I needed to keep track of, this seemed like a good option.\u003c/p\u003e","title":"Configure Bluehost with an Office 365 Domain Name"},{"content":" Gerry\nHello! I am Gerry Hendrickx, a computer science engineer working in IT since 2012 and founder of Hendrickx Consulting. I’m from the Mechelen region in Belgium, working mainly for clients in the Brussels area.\nI started off my career as a developer for an international consulting firm. Over the past years, I’ve worked for a number of different clients, ranging from insurance and banking to the public sector. I grew from a developer role into more team leading, project management and architecture, but still prefer to keep close contact with coding. It is my belief that in order to succeed as a project manager and architect, a strong technical background is a big plus. Additionally, in this ever-changing IT landscape, it is important to keep on learning, and remain up to speed with the latest and greatest developments in the field of IT. This can be a challenge, because working full time and combining this with a family life is not an easy job in itself, so the time to spend on learning is sparse. Nevertheless it’s needed, and this is why I decided to set up this blog.\nBlogging for me is not a way to make money, or build a reputation. First and foremost, blogging provides me with a way to learn new things and understand them more deeply by trying to explain them in a post. If someone finds value in my posts, then I couldn’t be happier to share.\nThe main topics that I’ll be writing about are:\nSoftware Architecture Artificial Intelligence Cognitive Services Cloud Development I hope you’ll find something useful for you on this blog. 
I’ll always refer to the documentation/guides that I used to jumpstart my learning, so you’ll be able to follow the same tracks.\nHappy reading, Gerry\n","permalink":"/about/","summary":"About me.","title":"About me"}]