More than a post, this is a note to self.
Honestly, over the years it has become tedious to set up a development environment time and again. I use Azure Virtual Machines as my personal development environment. And of course, a personal development environment is bound to be not as clean as you would like it to be, and chances are you will want to recreate it from scratch. Here are the steps I use to re-create my development environment:
Create Azure Virtual Machine
Of course, you can create a new machine via the portal. I just prefer to create it via the following script:
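The original script is not reproduced here; a minimal sketch using the Az PowerShell module (the resource group, VM name, image, and size below are my own placeholder assumptions) would look something like this:

```powershell
# Sign in interactively and pick the subscription
Connect-AzAccount

# Placeholder names -- adjust to taste
$rg       = 'dev-rg'
$location = 'westeurope'
$cred     = Get-Credential   # local admin account for the new VM

New-AzResourceGroup -Name $rg -Location $location

# Create a Windows Server VM; New-AzVM fills in sensible defaults
# for the virtual network, public IP, and network security group
New-AzVM -ResourceGroupName $rg `
         -Name 'dev-vm' `
         -Location $location `
         -Image 'Win2019Datacenter' `
         -Size 'Standard_D2s_v3' `
         -Credential $cred
```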
Change basic settings and Windows Features
- Install IIS
- Disable Loopbackcheck
- Enable PS Remoting
- Disable IE Security Check
- Set Execution Policy for PowerShell
- Install Chocolatey (used later for installation of packages)
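The steps above can be sketched as a single setup script (run in an elevated PowerShell session; the registry paths are the standard Windows locations for these settings, and the Chocolatey one-liner is the official install command from chocolatey.org):

```powershell
# Install IIS with the management tools
Install-WindowsFeature -Name Web-Server -IncludeManagementTools

# Disable the loopback check (needed when browsing local sites by host name)
New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa' `
    -Name 'DisableLoopbackCheck' -Value 1 -PropertyType DWord -Force

# Enable PowerShell remoting
Enable-PSRemoting -Force

# Disable IE Enhanced Security Configuration for administrators
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A7-37EF-4b3f-8CFC-4F3A74704073}' `
    -Name 'IsInstalled' -Value 0

# Allow locally written scripts to run
Set-ExecutionPolicy RemoteSigned -Force

# Install Chocolatey (used later for installation of packages)
Set-ExecutionPolicy Bypass -Scope Process -Force
Invoke-Expression ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
```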
Chocolatey comes into the picture
Chocolatey NuGet is a machine package manager, somewhat like apt-get, but built with Windows in mind. Lately, I have come to depend on it a lot, and why not? I no longer have to keep links to the software and packages I want to install. Chocolatey makes sure that every time I install a package I get the latest version, and with just a single command. Isn’t that cool!
The essentials I always install are:
- Visual Studio
- Visual Studio Code
Passing -y to the choco command makes sure that it runs in silent mode.
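In script form that is just two lines (the package IDs below are my assumption — check chocolatey.org for the exact names of the current Visual Studio packages):

```powershell
# Assumed package IDs; -y suppresses all confirmation prompts
choco install visualstudio2015community -y
choco install visualstudiocode -y
```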
Of course, this is not the complete configuration of my development environment, but this is how I usually start, and the environment evolves based on need. Also, I deliberately skipped the installation of SQL Server from this post and from the scripts, as it gives me a reason to make use of Azure SQL Database instead.
Needless to say, my love for PowerShell grows every time I make use of it. I had a good discussion with one of my colleagues some time back on whether PowerShell should be considered a scripting language or a programming language. It is certainly NOT a programming language. Having said that, simply classifying it as a scripting language (remember VBScript?) would not do justice to its capabilities.
“PowerShell is a true DevOps language.”
It has the right tools in its kitty, which help not only core developers but also day-to-day administrators. And I am still talking about the OLD WORLD where we had those distinctions. Now, let’s come back to 2015, where even Microsoft encourages you to deploy right from Visual Studio.
Anyway, I should come back to the intention of this post before I drift further. It’s about using PowerShell to call RESTful APIs.
Let’s take an example where I want to call a service that sends an email, accepting a JSON object with a To address, From address, Subject, Body, etc. to create a mail message object. Thanks to Postman, testing your APIs is easier than ever, as you don’t have to write the calling code anymore.
With my API ready, I wanted to schedule it to be called at a certain interval. Yes, I miss Azure WebJobs when running on IIS.
However, the good old Task Scheduler is still handy to create a task which calls a PowerShell script. The script looks something like this:
It uses Invoke-RestMethod. The Invoke-RestMethod cmdlet sends HTTP and HTTPS requests to Representational State Transfer (REST) web services that return richly structured data. The additional parameters such as -Body and -ContentType really make it powerful. If you look at the script, I define a normal PowerShell hash table mapping keys to values, and later use ConvertTo-Json to convert the hash table to a JSON object.
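The script itself is not reproduced here; a sketch along the lines described (the service URI and the JSON property names are assumptions) could be:

```powershell
# Build the mail message as an ordinary PowerShell hash table
$mail = @{
    To      = 'someone@example.com'
    From    = 'me@example.com'
    Subject = 'Scheduled report'
    Body    = 'Hello from a scheduled task.'
}

# Convert the hash table to JSON and POST it to the (assumed) email API
$json = $mail | ConvertTo-Json
Invoke-RestMethod -Uri 'https://api.example.com/email/send' `
                  -Method Post `
                  -Body $json `
                  -ContentType 'application/json'
```

Saved as a .ps1 file, this can then be wired up in Task Scheduler with an action of `powershell.exe -File <path-to-script>` on whatever trigger interval you need.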
Visual Studio Online provides a convenient way to access the data and combine it to provide the users with useful information. This is done via a set of REST APIs for Visual Studio Online.
Although several out-of-the-box queries are available to gather information, there are cases when you want to take it to the next level to suit your own needs.
Note that these APIs are also available for on-premises Team Foundation Server.
This post demonstrates an example of accessing the REST API using PowerShell to get the details of all changesets in a particular project collection.
Further, this changeset data is enriched with information about the work items linked to the changesets.
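A sketch of that approach, using the version 1.0 TFVC endpoints of the Visual Studio Online REST API (the account name is a placeholder, and authentication here assumes alternate credentials or a personal access token supplied via Get-Credential):

```powershell
$account = 'myaccount'        # assumed VSO account name
$cred    = Get-Credential     # alternate credentials / PAT

# Get all changesets in the default project collection
$uri = "https://$account.visualstudio.com/DefaultCollection/_apis/tfvc/changesets?api-version=1.0"
$changesets = Invoke-RestMethod -Uri $uri -Credential $cred

# Enrich each changeset with its linked work items
foreach ($cs in $changesets.value) {
    $wiUri = "https://$account.visualstudio.com/DefaultCollection/_apis/tfvc/changesets/$($cs.changesetId)/workItems?api-version=1.0"
    $workItems = Invoke-RestMethod -Uri $wiUri -Credential $cred
    $cs | Add-Member -MemberType NoteProperty -Name WorkItems -Value $workItems.value
    '{0} by {1}: {2} linked work item(s)' -f $cs.changesetId, $cs.author.displayName, $workItems.count
}
```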
Yet another attempt to sort out the pictures I have taken over the years.
I decided to write a simple PowerShell script to arrange my files by Year and Month.
The Year and Month of a file are determined by the “Date taken” property of the file. If this property is missing, the script falls back to the file’s CreationTime.
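A sketch of such a script (the source and target folders are placeholders; “Date taken” is read through the Shell.Application COM object, since it is an Explorer property rather than a filesystem attribute, and its column index varies by OS version and locale, so it is looked up by name):

```powershell
# Placeholder folders -- adjust as needed
$source = 'C:\Photos\Unsorted'
$target = 'C:\Photos\Sorted'

$shell  = New-Object -ComObject Shell.Application
$folder = $shell.Namespace($source)

# Find the column index of the "Date taken" property by its header name
$dateTakenIndex = 0..400 |
    Where-Object { $folder.GetDetailsOf($folder.Items(), $_) -eq 'Date taken' } |
    Select-Object -First 1

foreach ($file in Get-ChildItem -Path $source -File) {
    $item = $folder.ParseName($file.Name)
    $raw  = $folder.GetDetailsOf($item, $dateTakenIndex)

    # Strip the invisible formatting characters Explorer embeds, then parse;
    # fall back to CreationTime when "Date taken" is missing
    $clean = $raw -replace '[^\d/:\sAPM]', ''
    $date  = if ($clean) { [datetime]::Parse($clean) } else { $file.CreationTime }

    # Move into <target>\<Year>\<Month>, e.g. C:\Photos\Sorted\2015\07
    $dest = Join-Path $target ('{0}\{1:00}' -f $date.Year, $date.Month)
    New-Item -ItemType Directory -Path $dest -Force | Out-Null
    Move-Item -Path $file.FullName -Destination $dest
}
```

Note the date parsing assumes a locale whose short date format [datetime]::Parse can handle; a stricter script would use ParseExact with the machine’s known format.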
P.S.: I am yet to make up my mind whether or not to use Google Photos.