Microsoft Operating Systems and Me

People who know me know that one of my slogans is “Who needs an Operating System?”. Yet through some mysterious pathway I am currently taking Operating System exams (MD-100 = Windows Client, AZ-800 and AZ-801 = Windows Server). Huh??

Organizations, and the way they think about IT, only move so fast (actually, really slow). They cling to the Desktop, be it physical or virtual, and they cling to Servers, be it on-prem or in the Cloud, while the technology to get rid of that ancient stuff has been out there for a while now. Who would need a full Desktop OS if all Applications were Web Apps? Why deploy IaaS solutions when PaaS and SaaS solutions are readily available?

So anyway, apparently we live in a Hybrid IT world, which in this context means that we mix and mingle traditional deployments with all the Cloud goodies. That does not make it easier to use these goodies. And there pops up my reason to take another round (#8) of Microsoft Operating System Certifications: to keep serving as Trainer, Consultant and Advisor. Reality sucks, doesn’t it?

Happy Learning!

 

MCT: Delivering AZ-305, notes from the field

This week I did a first-time delivery of MOC AZ-305, Designing Microsoft Azure Infrastructure Solutions, which leads towards the Microsoft Certified: Azure Solutions Architect Expert certification. The Exam is still in beta; I took it a week before delivering the training, making the time investment count double: preparing for my own Exam and for delivering the training.

I consider this training and this exam (topics, depth and breadth) one of the better ones in the whole Curriculum: the real “Expert level” deal. The 4-day training can be delivered without Labs; the GoDeploy Labs have no direct link with the course content, but they offer some great deep dives into specific technologies should those fall outside the knowledge scope of the participants. They are long Labs, up to 3-4 hours. Each Module ends with one or more Case Studies, with plenty of room to discuss various options.

After my preparation I had some concerns about the knowledge level of the participants; it happens too often that people overestimate themselves, in which case they might get lost around day 2. The course covers a lot of Cloud in just 4 days: the more knowledge the participants already bring, the more value they get out of this course. I would suggest to Microsoft Learning that the prerequisites (also for Certification) should be more than just AZ-104 Azure Administrator: add 1 or 2 “electives” (as in the old days of the MCSE Certifications). An elective could be almost any Associate-level Certification (AZ-500, AZ-700, SC-300, AZ-204, etc.).

Luckily, the 6 participants in this group had selected the right course: they were all seasoned senior Azure Admins/Engineers, and there was plenty of expertise in the (virtual) room on specific technologies like SQL and Kubernetes. But anyhow: take care with the intake of participants for this course.

I started off with a whiteboard session on the “Well-Architected Framework” to give some context on how to approach the content, so that in the discussions on the Case Studies we could weigh possible solutions against these principles. The Exam refers to these principles as well; maybe a module on this could be added to the course. On days 1 and 2 (and as planned for days 3 and 4) I allocated some 2-3 hours for the participants to work on the Labs. On day 3, however, a request was made not to use course time for Labs and rather spend it on discussing the topics and the case studies. The vote on that was unanimous. Wonderful! We ended up in vivid discussions and we all learned a lot from each other. I invited them to look at becoming an MCT, and I think some of them will take that path, as we all agreed that the Knowledge and Skills Gaps are a big showstopper for leveraging Cloud technologies. Do something about it instead of complaining about it.

Overall, I am very satisfied with this course and I’m looking forward to delivering it again (scheduled for March).

 

Happy Learning!

Security: get the whole deal

By now we all know Microsoft has become a “Security Company”; its current portfolio on Security, Compliance and Governance is unmatched. Meanwhile, most Organizations realize their Security posture is not what it should be, not to mention their Compliance and Governance posture.

Plenty of Office 365 customers come to me for a Solution to a specific issue they encounter: ransomware, spoofing, account breaches, compliance requirements, you name it. They perform some searches on the Internet and find an Add-On Subscription to remediate their issue. That is reactive. From experience I know for sure they will be back before long with another issue and another Add-On to remediate it. Reactive once more.

Let’s stop doing that. Let’s start being pro-active. Digital transformation is nothing more than “losing old habits and creating new habits”. How do we get there? Not by enumerating factsheets of product capabilities. We get there by showing Business Decision Makers what the threats are from the Business and User perspective. Then we show them how to remediate those threats and what that looks like from the Business and User perspective. Loved by Users, trusted by IT. Pro-active. Let the ever-present Mr. Murphy die a slow but certain death.

Having these presentations and conversations with customers creates an instant transformation: Value becomes more relevant than Cost. So we can stop talking about Add-Ons and start talking about the complete packages. They bring the Value.

Oops, is this a Sales pitch?

Happy protecting!

 

Good Fun with Surface Pro 3, Nested Hyper-V, Nano Server and Azure Site Recovery

For a Microsoft event in Jamaica I was asked to deliver a session on Azure DR. I believe that seeing is believing, so I always tend to use as few slides as possible and just demo the Solution. On a Saturday morning I started out thinking and drawing out the Infrastructure I would need to pull that off.

I have a SOHO environment, and for Inova Solutions, my employer, I have moved everything to the Microsoft Cloud. So my Surface Pro 3 (i5, 8 GB RAM) will just have to do as my “local datacenter”.

The steps for Azure Site Recovery are very well explained here https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-hyper-v-site-to-azure, including screenshots, so I will not copy all of that into this blog. For my “Onprem Datacenter” I need a Hyper-V host; well, Windows 10 has Hyper-V. Download the Azure Site Recovery Provider from the Azure Portal and run it. Bummer.

Apparently, the installer does not see Hyper-V on Windows 10 as a valid host for the ASR Provider; I need Server 2016 or Server 2012 R2 Hyper-V. Lucky for me, both Windows 10 Hyper-V and Server 2016 Hyper-V support nested Hyper-V. To enable nested Hyper-V you download and run a PowerShell script called Enable-NestedVm.ps1 from https://github.com/Microsoft/Virtualization-Documentation/tree/master/hyperv-tools/Nested, after which you can verify the result by running Get-NestedVirtStatus.ps1 from the same repository. In the first script I had to confirm that I would be running the nested Hyper-V Server with less than 4 GB of memory.
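
For reference, the heavy lifting those scripts do boils down to a couple of standard Hyper-V cmdlets. A minimal sketch, assuming a (hypothetical) VM named NestedHV that is shut down while you reconfigure it:

    # Expose the virtualization extensions of the physical CPU to the VM
    Set-VMProcessor -VMName "NestedHV" -ExposeVirtualizationExtensions $true

    # Let the nested VMs reach the network through the parent's virtual switch
    Set-VMNetworkAdapter -VMName "NestedHV" -MacAddressSpoofing On

    # Nested virtualization does not support Dynamic Memory, so assign a fixed amount
    Set-VMMemory -VMName "NestedHV" -DynamicMemoryEnabled $false -StartupBytes 3GB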

With my nested Hyper-V Server running on just 3 GB of memory, I needed to create some seriously small servers. Another reason for creating small servers is that I want small disks to replicate. Thanks again, Server 2016: we now have Nano Server! Here is a Quick Start Guide for Nano Server: https://technet.microsoft.com/en-us/windows-server-docs/get-started/nano-server-quick-start. In no time I created 2 Nano Servers, Nano1 and Nano2, each assigned 128 MB of RAM and a 1 GB OS disk. I successfully installed the ASR Provider and registered the nested Hyper-V Server with the downloaded registration file. Within 5 minutes my server popped up in the Azure Portal and I could select Nano1 and Nano2 to start replicating.
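
Creating those tiny VMs on the nested host is little more than a one-liner per server. A sketch of roughly what that looks like, assuming the Nano Server VHDX files have already been built per the Quick Start Guide and copied to C:\Nano, and that a virtual switch named NestedSwitch exists (both are hypothetical names):

    # Create two minimal Generation 2 VMs with 128 MB of RAM each and start them
    foreach ($name in "Nano1", "Nano2") {
        New-VM -Name $name -MemoryStartupBytes 128MB -Generation 2 `
               -VHDPath "C:\Nano\$name.vhdx" -SwitchName "NestedSwitch"
        Start-VM -Name $name
    }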


I ran a Test Failover just to see if I could complete the whole exercise within 45 minutes, the length of my presentation. The test failover only took about 2 minutes.

All of this just to demonstrate that with minimal equipment you can demo an Enterprise-grade feature like Azure Site Recovery using some neat technologies. The first build took a bit longer than 45 minutes to figure it all out; I have now run the whole thing 3 times to get comfortable doing it on stage, well within the 45 minutes 🙂.

 

Happy demoing!!!

Windows Phone Dead because of Lack of Apps – Really?

Yes, I am biased, I am a Microsoftie. That being said, I think it is a pity that people write off Windows Phone, especially if it’s for the wrong, or even non-existent, reason.

My Windows Phone does everything I need it to do and even more. I have my Office 365 accounts on it, my Microsoft Band app, my Dash, my music, camera, Twitter, what have you. I’m not here to crunch numbers on how many apps are out there on the different platforms, although that is the main thing people hold against Windows Phone: there are just not enough apps. Now, I just found a decent article on what we actually DO on our devices, using ComScore’s “The 2015 U.S. Mobile App Report” as a source. The article is in Dutch (sorry for that, it came from a Belgian site), but here are some of the numbers.

We spend 89% of our time on a smartphone in apps; only 11% goes to the Mobile Web. We spend 80% of our time on only 3 apps! Only 3 apps! With Facebook being nr. 1 by far, followed by YouTube, all flavors of “Chatter and Social” apps and Games. 65% of people hardly download any apps after buying their device; those main apps come with the device Out-Of-The-Box. We install apps that come with specific hardware like GoPro cameras, health devices and smartwatches, and we install apps for Banking, Starbucks and Dunkin’ Donuts. In the ComScore report there is no app mentioned that does not run on Windows Phone. Funny detail in the article and the report: no mention of the usage of the “phone app”. Time to get rid of the Phone App?

So the general opinion that “Windows Phone lacks apps” is not consistent with what we actually DO on our devices. Windows Phone is dead; must be the Conspiracy…. What say you?

Nevertheless, I will bring only my Windows Phone to deliver my presentations and demos on Azure Disaster Recovery during our events in Bermuda and Belize next week. Hello iOS, hello Android…. anybody out there?

Happy Apping to you all!

Cloud Adoption, where to start: CEO

This is the third blog in a series on Cloud Adoption and Cloud Migration. Previously I wrote “The GAP between Cloud Migration and Cloud Adoption” and “Office 365 and Bandwidth – Adoption to Cloud Computing”. This one is on Ownership of Cloud Adoption and Ownership of Cloud Migration. As explained in the previously mentioned posts, Adoption and Migration are two totally different things.

IT Departments are responsible for Cloud Migration(s). It’s about the technical challenges of moving workloads to the Cloud. Ownership of Migration lies with the IT Department, somewhat automatically delegated by the Organization. Not much to discuss here.

Now, Cloud Adoption: who has Ownership of that? I have seen a lot of Migrations not yield the expected results, not because the Migration was bad but because the Organization did not benefit from it or, even worse, continued “business as usual”. This is not unique to Cloud Migrations; it could be on-prem Exchange, SharePoint, Desktop OS or Office migrations as well. A lot of Organizations run the latest versions of those but still live in the dark ages when it comes to using them, because nobody in the Organization took Ownership of the Adoption. Mostly that was left to IT Managers. But who listens to IT Managers? Not the Sales and Marketing Managers, for sure; they are busy. And so any free 1996 Pegasus mail server and client could actually do the job. IT should not be the owner of the Adoption of features made possible by the Migration. It should work the other way around: first there is a Feature Requirement list made by the Business, and out of that an IT Project/Migration may get started.

That being said (Adoption first), the question remains who must be the Owner if not IT. The answer is very simple: the CEO. If the CEO is not the Owner of Adoption, every IT Manager sets himself up for failure when engaging in whatever Migration. Adoption touches the very heart and nature of the way people work, and thus the Organization. If that is not endorsed, empowered and owned by the CEO, well, good luck. Ownership will trickle down into the Organization from the highest management, making sure everything is in place when the stages of the Migration arrive. I have had very good experiences with migrating higher Management first: let them “Walk the Talk” and show that “all is well”.

Also, when progress stops because CEOs are not taking Ownership, Shadow IT becomes a painful reality. The percentage of users finding their own way to do their job rises, and IT loses yet more control, as does the Organization itself. Mobile devices, tablets, notebooks, Dropbox, Skype, OneDrive: unmanaged devices, unmanaged storage. Where is the corporate content going? That makes any Organization’s fear about safety in the Cloud a bit ridiculous, doesn’t it?

To get CEOs back in control, IT Management needs to have good connections with higher Management. As a Consultant I can’t do much if I’m stuck at the level of IT. IT may or may not understand what direction to go, but if higher management speaks a different language, then IT is also stuck. This morning I had a one-hour conversation on this with the IT Manager of one of my customers; he is stuck in that exact situation. I could only listen to him and coach him on how to repair the past damage in those lines of communication, so that his CEO gets aligned again and puts his fist on the table to move forward. I asked him to keep me posted on how that goes.

Another Customer sits on the other side of this. His CEO is fully on board with Azure and they are really moving forward FAST now. The CEO knows nothing about Technology, but he was informed in such a way that he could endorse and empower the Organization to move in that direction.

Conclusion: Cloud Adoption starts at the CEO!

 

CEOs, Happy Adopting!!! You really should!

Accounts, Identities and mail addresses

Users want to access applications and data that run anywhere, and they want to do so from anywhere. There is only a very thin line left distinguishing business apps from non-business apps, and they all need to be accessible anytime, anyplace, anywhere. That calls for Identity Management, which can be very confusing for users. So here is a little explanation of the why, the what and the how.

In the old days we used to log on to our computer using this format:

  • Domain\user
  • Computer\user

Or maybe even without the domain\ or computer\:

  • User (domain user)
  • User (local computer user)

As long as the applications and their data sat on that local computer or in that local Active Directory domain, a logon like this worked perfectly. (Really? No. It uses NetBIOS and that protocol is soooo 1987, but that discussion is out of scope for this article)

The logon identity for a user must now be valid outside of the local computer and the local Active Directory as well. It must identify the user as a unique identity across multiple platforms, preferably: local computer, Active Directory, cloud applications like Office365 and personal applications like Facebook, Gmail, Twitter and the rest of it all.

The logon format is called the User Principal Name (UPN). A UPN looks like an email address, and that can be very confusing for some users. For example (a generic placeholder): jdoe@contoso.com

NOTE: this is NOT necessarily a mail address! A mail address for that same user could be, for example: john.doe@contoso.com

If your computer is running Windows 8 (or above) you can log on using a Microsoft Account (a.k.a. LiveID, Hotmail.com or outlook.com account). Such an account always uses the UPN format, and it may or may not correspond to your private email address.

If you use multiple devices (PC, laptop, tablet, Windows Phone), the Microsoft Account synchronizes a lot of settings between them (recent documents, desktop wallpaper, etc.) so that you experience a unified work environment on whatever device you use.

Next to a Microsoft Account (private, individual), you can have an Active Directory account to access corporate resources in your corporate network, maybe even remotely. Active Directory can (and should) use the UPN as the logon format instead of the NetBIOS domain\user.
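
For administrators who want to check or clean this up, here is a minimal sketch using the ActiveDirectory PowerShell module; the user jdoe and the domain contoso.com are hypothetical placeholders:

    # Requires the RSAT / AD DS PowerShell tools
    Import-Module ActiveDirectory

    # Show the UPN and mail address currently set on a user
    Get-ADUser -Identity jdoe -Properties UserPrincipalName, EmailAddress |
        Select-Object SamAccountName, UserPrincipalName, EmailAddress

    # Give the user a routable UPN instead of the NetBIOS-style CONTOSO\jdoe logon
    Set-ADUser -Identity jdoe -UserPrincipalName "jdoe@contoso.com"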

When using Microsoft Online services like Office 365, Microsoft Intune or Microsoft Azure, you may have a so-called Organizational Account, always in the format of a UPN. It can be synchronized from your Active Directory account, even with synchronization of your password. But beware: unless a thing called Federated Identities is enabled by your administrators, it is still 2 separate identities; you log on to separate authentication providers, your local Active Directory and a Cloud authentication provider like Azure Active Directory.

So far, this has all been about accounts, the mechanism with which you authenticate yourself. Now we get to email addresses.

An email address uses the same format as a UPN (in fact, the UPN format was modeled on the familiar email address format). As noted above, the account does not necessarily have to match an email address. It can, but it is not a requirement.

And that is exactly what can make it very confusing for users if they do not distinguish between accounts (UPN, identity) and email addresses. THEY ARE TWO DIFFERENT THINGS! A user can sign in with one UPN and through that have access to multiple mail addresses (aliases), even in different domains.
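
To make that concrete, here is a hedged sketch using the Exchange Management Shell (the same idea applies in Exchange Online); the user jdoe and the domains are hypothetical placeholders:

    # One identity: the mailbox tied to the UPN jdoe@contoso.com
    Get-Mailbox -Identity jdoe | Select-Object UserPrincipalName, PrimarySmtpAddress

    # ...can carry several mail addresses (aliases), even in different domains
    Set-Mailbox -Identity jdoe -EmailAddresses @{Add="john.doe@contoso.com","j.doe@fabrikam.com"}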

Some organizations and users try to match the account UPN to the email address, making it simple for users: who you are (account UPN) is your mail address. It gets confusing when you have multiple accounts AND multiple email addresses. To get it straightened out for yourself, you can create a little table like this and fill in the appropriate UPNs:

Access to          | Microsoft Account | Active Directory Account | Organizational Account
-------------------+-------------------+--------------------------+-----------------------
Devices            |                   |                          |
Active Directory   |                   |                          |
Online Services    |                   |                          |
Office ProPlus     |                   |                          |
Work email         |                   |                          |
Private email      |                   |                          |

 

Happy logging on!

 

Azure in Plain Words

Microsoft Azure is BOOMING. The platform is getting new features and feature updates on a daily basis, and for most people it is hard to keep up with that. Some may even give up on it, overwhelmed by the possibilities of Azure. In this blogpost I will try to describe, in plain words, the 8 features that I like most. No technical deep dives, just the functionality and the benefits of having that functionality run on Azure. Basically, everything sits in the cloud, meaning no local resources: Storage, CPU, RAM and Networking. And you pay per use.

  1. Azure Websites. Whatever website you want to run (a basic static public-facing company website, a web application, a web shop), Azure offers the platform. This blog sits on Azure as a simple Azure Website with a WordPress template deployed on it; it takes about 10 minutes to set that up and have it running. For more complex web applications, with a backend database or transaction processing, we can deploy websites as a Cloud Service. And finally, to have total control, we can create Virtual Machines running the web application. All 3 options scale with demand, and of course all 3 flavors come with their own set of benefits and drawbacks (see the sketch after this list).
  2. Azure Virtual Machines. Within minutes we can spin up a Server and manage it as if it were running locally, or we can upload Virtual Machines that are already running locally. Scale up the servers (add RAM and CPU) as needed, or scale out (add servers) as needed: during the daytime 6 servers could be running, at night just one. Any Server Role can run on an Azure Virtual Machine. You could start by deploying a test & development environment in Azure; the template Gallery offers plenty of pre-loaded templates.
  3. Azure Virtual Networks. For those Virtual Machines to function they need to be placed in a network. We can create Virtual Networks on Azure to connect to those Virtual Machines and to connect the Virtual Machines to whatever they need to reach. We can extend our local networks to Azure and spin up Servers and Services as needed.
  4. Azure Service Bus. More and more applications tend to communicate with each other. In order to do that there must be some kind of communication infrastructure in place. Through Azure Service Bus applications can talk to each other wherever the application runs. Some applications just talk while others only listen and some may do both, so that those distributed applications can complete the tasks they are set up to do.
  5. Azure RemoteApp. Azure RemoteApp became Generally Available just a week ago and enables organizations to publish just about any application to end users, wherever they are. Scalable, resilient, redundant and connected to any backend required. RemoteApp has been available on on-prem servers for a couple of years; this takes it to the cloud!

  6. Azure Storage. Of course, the Services we run on Azure require storage: fast storage, highly available storage, cheap storage. One of the use cases for Azure Storage is backup of local servers. We all know that backups should be kept off premises; well, this is a nice way of having that in place.
  7. Azure SQL Database. A lot of applications need a SQL Database to store, query and retrieve data, and sometimes it is a bit over the top to deploy a full-featured SQL Server. Azure SQL Database offers SQL Database services without the need to deploy a Virtual Server and install SQL Server on it.
  8. Azure Active Directory. Authentication and Authorization mechanisms need to be in place in order to allow users to access the resources in Azure. We can integrate that with our local Active Directory, but also with “industry authentication providers” like Google and Facebook through Azure Access Control Services.
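
As a taste of how little effort item 1 takes today, here is a minimal sketch using the current Az PowerShell module (which did not exist when this post was written); the resource names and the westeurope location are hypothetical placeholders:

    # Sign in first with Connect-AzAccount, then create a resource group,
    # a small App Service plan and the website itself
    New-AzResourceGroup -Name "rg-blog" -Location "westeurope"
    New-AzAppServicePlan -ResourceGroupName "rg-blog" -Name "plan-blog" -Location "westeurope" -Tier "Basic"
    New-AzWebApp -ResourceGroupName "rg-blog" -Name "my-blog-site" -Location "westeurope" -AppServicePlan "plan-blog"
    # From here you could deploy WordPress (or any other web app) onto the site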

Microsoft Azure offers much more than what is mentioned in this blogpost; please visit http://azure.microsoft.com/ for more information.