Earlier this year, my boss, Joy Chik, CVP of Identity Engineering, shared Microsoft’s guiding principles of our identity and access management (IAM) strategy, emphasizing our commitment to delivering a secure and scalable identity solution. Azure AD safeguards access to your apps by enforcing strong authentication and adaptive risk-based access policies, providing seamless user access with single sign-on (SSO) and reduced IT costs. We envision Azure AD as the key to embracing a Zero Trust security model, enabling secure application access and greater productivity across users, apps, and devices.
Consistently landing in the Gartner Magic Quadrant for the past four years tells us that we’re executing on our vision and making a difference for you, our customers.
We’ve learned from your resilience in adapting to remote work over the past year, and your direct feedback has shaped our advancements in several areas:
Adaptive security: Azure AD natively offers comprehensive logging, dashboard, and reporting capabilities, as well as identity analytics with Azure AD Identity Protection.
Secure application access: Azure AD supports out-of-the-box single sign-on (SSO) and provisioning connectors to thousands of SaaS apps, as well as authentication for legacy on-premises applications through App Proxy and secure hybrid-access partnerships.
Report-only mode: The report-only (or audit-only) mode enables administrators to evaluate the impact of Conditional Access policies before enabling them for users.
API access control: We offer built-in centralized policy management, management of security tokens, token translation, and developer self-service support. In addition, Azure AD offers native integration with the Azure API Management service or with third-party API gateway products for more advanced API security.
We’re honored to place this well for the fourth time and believe it reflects the energy and passion we’ve put into partnering with our customers to help them successfully transform their businesses digitally. That said, there’s lots more work to do, and we look forward to continuing to partner with you, our customers, to ensure the products we build keep your organizations secure and productive. We’re grateful for your trust, and I look forward to seeing what we can accomplish together in the coming year.
To learn more about Microsoft Identity solutions, visit our website. Bookmark the Security blog to keep up with our expert coverage on security matters. Also, follow us at @AzureAD and @MSFTSecurity for the latest news and updates on identity and cybersecurity.
This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Microsoft.
Gartner does not endorse any vendor, product, or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
In this unusual year, organizations have doubled down on digital engagement with their customers and are prioritizing the security and customization of their user experiences. We’ve kept this top of mind as we evolve our vision for Azure Active Directory (Azure AD) External Identities, making customization of identity experiences easier than ever.
Today we're announcing new ways you can customize your B2C apps. Once again, we’ve got Partner Group PM Manager Robin Goldstein on the blog to tell you more.
As always, we hope you’ll try out the new features and share feedback through the Azure forum or by following @AzureAD on Twitter.
At Ignite, we announced a step forward in our Azure Active Directory (Azure AD) External Identities journey with the addition of Conditional Access and Identity Protection to Azure AD B2C, extending Microsoft’s world-class security to help you protect customer and citizen identities. Today, we are excited to announce two more features that make it easier to design secure and seamless customer-facing experiences in Azure AD B2C: API connectors, and phone sign-up and sign-in for user flows.
Extend and secure user experiences with API connectors in Azure AD B2C
If you’ve been using Azure AD B2C already, you may be familiar with the ability to use REST APIs in your custom policies. With API connectors for user flows, you can now enjoy similar flexibility with our next-generation user flows, which are also in public preview.
Azure Portal experience adding an API connector to a user flow in Azure AD B2C
Here are some more great examples of scenarios you can enable with API connectors:
Protect against automated fraud and abuse.
Figure 1. A sign-up experience using the Arkose Labs Platform to protect against automated fraud and abuse.
Use invitation codes
Another way to protect your sign-up experiences is to limit them to certain audiences. Using API connectors, you can provision invitation codes for specific audiences and require users to enter a valid code during sign-up.
Figure 2. A user flow that limits sign-ups to users with an invitation code.
Perform identity verification
Verifying or affirming your user’s identity can also reduce the risk of fraudulent sign-ups by malicious actors. Using API connectors, you can integrate solutions from IDology, Experian, or other providers to verify user information based on user attributes collected at sign-up.
Figure 3. A sign-up flow that collects user information and uses it to verify a user’s identity.
Simplify access with phone sign-up and sign-in user flows
Rounding out our improvements to user flows in Azure AD B2C, you can now enable users to sign up and sign in to your app using their phone number (phone-based SUSI). This reduces the need for additional passwords and makes the experience much easier on mobile devices. Like other credentials and identity providers, setting up phone-based SUSI for a user flow takes just a few clicks. This feature is now being rolled out worldwide.
To get started, you can set up a user flow in the admin portal, using the combined phone/email sign-up option now under local accounts in the identity providers blade:
End users will see the option to use their phone number, as well as a link to change their phone number when they get a new phone.
Configure whether to collect a recovery email from users during sign-up or sign-in, to make it easier for users to reset their account.
Admin experience for customizing identity providers settings on a user flow (left) and the resulting end user experience (right).
Admin experience for configuring the recovery email prompt during sign-up and sign in (left) and the resulting end user experience (right).
On behalf of the Azure AD External Identities crew, thank you for your feedback so far. We hope you’ll try out both preview features and share more about how you are customizing your B2C user experiences.
Robin Goldstein (@Robingo_MS), Partner Group PM Manager, Microsoft Identity Division
Data is the new oil, and the growth of data across the economy is huge. That said, figuring out how to best handle data management can be difficult for many organizations. How much data you should keep and how much should be destroyed are vital considerations that aren’t necessarily cut-and-dried.
What are the current requirements/regulatory mandates that customers and businesses need to adhere to? And based on those, what data are you required to keep?
It all comes down to:
1) What do you have to maintain?
2) Why do you have to maintain it?
3) How long do you have to maintain it for?
Those three things define the retention boundaries for any organization.
If you want to protect data in the event that it’s accidentally modified, corrupted, or made inaccessible, how do you ensure the business can continue once access to that particular item, document, asset, or record is restored?
Many organizations treat retention and data protection as essentially the same thing and try to put them in the same bucket. Since they realistically aren’t the same, it’s important to have the right strategy in place and leverage technology to address the differences.
What Makes a Data Retention Policy?
The top four areas you should consider when creating a data retention policy are:
1. Compliance with Regulatory Laws
Ensure that you’re complying with any state or federal laws governing the destruction of records before an acceptable time frame. This includes being able to manage risk for any departments that are responsible for meeting privacy laws.
2. Rules for Data Destruction
As an organization, you need to be able to define specific rules around when something can be removed or deleted from the business. These rules flow from the compliance requirements above: once you know what you’re required to maintain, you can define your destruction rules. Maybe you need to provide some proof of destruction, for instance.
3. How to Make the Business More Efficient
With all this data, how can you make it easy to find everything as quickly and efficiently as possible? You don’t want to retrieve old documents or burden the system and force the end user to sift through more than they should. Thus, decluttering the amount of data available to you is key.
4. Cost Savings on Storage
Naturally, there’s a cost associated with keeping all of this data. If you can remove data when possible, you can avoid unnecessary retention fees.
For customers who store data on-premises, the potential cost savings could be tremendous because of the capital expenditures that most on-premises customers still have to plan and manage. Buying additional servers or disks to keep up with all the data that’s being hoarded is serious money.
For those using a cloud service that leverages a more operational model where you’re just paying for the services you need, storage is a bit less of a factor. It still matters, but it depends on where that data resides.
At its core, cost savings is all about being efficient with unnecessary storage of content that’s not valuable or required for the organization. This includes physical content too, e.g. file cabinets or traditional hard drives.
While the growth of data is more top of mind today, the concepts of retention and data protection backups have been around for quite a while, and many solutions have arisen over the years. These include:
The native retention-policy capabilities within Microsoft products,
Protections for the service-level agreement around uptime and access,
Proper data protection to ensure data remains retrievable and accessible.
Take advantage of what you have outside of the box, figure out where gaps are, and look for solutions to help fill those gaps. Look at what you have to keep, what you want to keep, and what you want to get rid of. Efficiency is at the heart of any great data retention plan.
I have recently been engaged to move a customer from Microsoft Exchange 2010 to Exchange 2016 so they can move to a modern platform and leverage features such as cloud deployments, improved reliability, and a new architecture that is more in line with their technology roadmap.
Before I move on, I just want to highlight the features of 2016 in comparison to 2010.
Exchange 2010 had separate server roles, such as Mailbox, Hub Transport, Unified Messaging, and Client Access, each performing a distinct function. In Exchange 2016, these have been consolidated into a single Mailbox role that performs all of their combined functions.
Exchange Admin Center
The Exchange Admin Center (EAC) has been greatly enhanced to help you connect from anywhere using a web browser. It acts as a single point of control for all operations and is optimized for on-premises, online, and hybrid Exchange deployments. With this enhanced EAC, the Exchange Management Console (EMC) of 2010 has taken a back seat: Microsoft observed delayed updates in the EMC, which is why it decided to limit its scope in 2016.
Hybrid Configuration Wizard (HCW)
Exchange 2016 has a cloud-based application called Hybrid Configuration Wizard (HCW) that helps to connect with other Microsoft tools like Office 365 in real-time. Improved diagnostics and troubleshooting make it ideal for hybrid deployments.
MAPI over HTTP
MAPI over HTTP is the default protocol in Exchange 2016, as it is more reliable and stable than the RPC over HTTP protocol of Exchange 2010. This protocol also allows Outlook to pause a connection, change networks, and resume from hibernation, things that were difficult to achieve with Exchange 2010.
Certificate Management
In 2010, you had to install a certificate on every server through the EMC, while in 2016, you can install certificates across multiple servers at the same time through the EAC. You can also see certificate expiry details in the EAC.
Now that you know why Exchange 2016 is better, let’s see how to migrate from version 2010 to 2016.
Update the existing environment
If you’re unsure of the version you’re using, open the Exchange Management Shell and run this command:
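For example (the exact output formatting may vary slightly between builds):

```powershell
# List each Exchange server with its edition and version number
Get-ExchangeServer | Format-List Name, Edition, AdminDisplayVersion
```

An AdminDisplayVersion starting with 14.x indicates Exchange 2010.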
This should bring up the current version you’re using. Make sure it says Exchange 2010.
The first step is to update the existing environment to make the 2010 version suitable for upgrading to 2016. To do that, install Exchange 2010 Service Pack 3 and Exchange 2010 SP3 Update Rollup 11. These are the minimum supported patch level updates for 2010, and the installation process is fairly self-explanatory.
The next step is to consider updating the directory service and Outlook clients. For Exchange 2016, the minimum directory service requirement is an Active Directory functional level of Windows Server 2008. For clients, Exchange 2016 supports Outlook 2010 and above on Windows, and Outlook for Mac 2011 and above on the Mac. You should update clients to these minimum supported versions before implementing Exchange 2016.
Prepare the System for Exchange Server 2016
Do you have the system requirements needed to support Exchange 2016? Let’s double-check the requirements below, as Exchange Server 2016 supports only the following:
Windows Server 2012 / 2012 R2
A minimum memory requirement of 8GB for the Mailbox server role and 4GB for the Edge Transport role
The paging file size should be set to physical RAM plus 10MB, up to a maximum of 32788MB. If you’re using 32GB of RAM or more, go with the maximum of 32788MB
Disk space of at least 30GB on the drive on which you plan to install Exchange, plus an additional 500MB for every Unified Messaging (UM) language pack you want to install, 200MB of available disk space on the system drive, and a minimum of 500MB of free space on the drive holding the message queue database
A screen resolution of 1024 x 768 pixels
Disk partitions that are formatted on the NTFS file system
The .NET Framework and the Unified Communications Managed API (UCMA) should be installed before installing Exchange 2016. You can download both from the Microsoft website and install them on your system.
Make sure your system meets all these prerequisites before installing Exchange 2016.
Next, you have to prepare the schema update. This step is irreversible, so make sure you have a full backup of Active Directory before proceeding.
A good part about this migration is that you don’t have to worry much about changing HTTPS names for OWA, as both versions support the same set of naming services and ActiveSync directories.
Install Active Directory for Exchange 2016
Next, run the Exchange 2016 setup. Choose a specific directory into which to extract all the setup files. Once the extraction is complete, open the command prompt, go to the directory where you extracted the files, and run the following commands one after the other.
The first command prepares the schema: setup.exe /PrepareSchema /IAcceptExchangeServerLicenseTerms
Once the schema is prepared, move on to the next command: setup.exe /PrepareAD /IAcceptExchangeServerLicenseTerms. Once that’s done, prepare your domain with setup.exe /PrepareDomain /IAcceptExchangeServerLicenseTerms. With this, the Active Directory preparation for Exchange 2016 is complete.
A restart is required after the roles and features have finished installing. If you’d prefer that the server restarts itself automatically simply append -Restart to the command.
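For reference, the Windows Server roles and features for the Exchange 2016 Mailbox role can be installed from an elevated PowerShell prompt with a one-liner like the one below. This is a sketch based on Microsoft’s published prerequisite list for Windows Server 2012 R2; verify the feature list against the prerequisites documentation for your exact build.

```powershell
# Install the Windows features required by the Exchange 2016 Mailbox role,
# then restart automatically once installation completes
Install-WindowsFeature AS-HTTP-Activation, Server-Media-Foundation, NET-Framework-45-Features,
    RPC-over-HTTP-proxy, RSAT-Clustering, RSAT-Clustering-CmdInterface, RSAT-Clustering-Mgmt,
    RSAT-Clustering-PowerShell, Web-Mgmt-Console, WAS-Process-Model, Web-Asp-Net45,
    Web-Basic-Auth, Web-Client-Auth, Web-Digest-Auth, Web-Dir-Browsing, Web-Dyn-Compression,
    Web-Http-Errors, Web-Http-Logging, Web-Http-Redirect, Web-Http-Tracing, Web-ISAPI-Ext,
    Web-ISAPI-Filter, Web-Lgcy-Mgmt-Console, Web-Metabase, Web-Mgmt-Service, Web-Net-Ext45,
    Web-Request-Monitor, Web-Server, Web-Stat-Compression, Web-Static-Content,
    Web-Windows-Auth, Web-WMI, Windows-Identity-Foundation -Restart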
After the restart, download and install (in order):
A supported version of .NET Framework. Make sure you check the supportability matrix for more details as there are minimum and maximum supported versions that vary depending on the build of Exchange 2016 you’re installing.
Now that the environment is set up and the Exchange prerequisites are met, we can install Exchange 2016. Let’s follow the steps through using the installation wizard.
Browse through the setup directory, and run the file called Setup.exe.
During the installation, you’ll be prompted for server role selection. Choose “Mailbox role,” and the other options will automatically be deactivated, because the Mailbox and Edge Transport roles cannot coexist on the same machine.
Installation will complete within the next few minutes.
Once the installation is complete, click on the Finish button. This will load the Exchange Admin Center on the browser.
Exchange management console in 2010 is replaced with a web-based Exchange Admin Center in 2016. This is the place where you can have greater control over all operations.
After installing Exchange 2016 successfully, update the Service Connection Point for Autodiscover. To do this, use the Set-ClientAccessService cmdlet from the Exchange Management Shell.
Go to the Exchange Management Shell, and type this command:
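A sketch of the command is below; the server name and Autodiscover URL are placeholders for your own values. Note that on Exchange 2016 builds earlier than CU2 the cmdlet is named Set-ClientAccessServer instead of Set-ClientAccessService.

```powershell
# Point the Service Connection Point (SCP) for Autodiscover at the new server
# "EX2016" and the contoso.com URL are placeholder values
Set-ClientAccessService -Identity "EX2016" `
    -AutoDiscoverServiceInternalUri "https://autodiscover.contoso.com/Autodiscover/Autodiscover.xml"
```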
Next, update the settings of Outlook Anywhere. To do this, go to the EAC and click on servers on the left-hand side. This will open up the list of servers. Click the Edit icon and a pop-up will open. Choose the Outlook Anywhere option, and update the DNS lookup and IMAP4 settings with the name of your new server.
Once you’ve configured the settings, run IIS RESET. To do this, go to your command prompt and run the command iisreset. This will stop and restart IIS services.
The next step is to configure your Receive Connector to relay email from applications. To configure this, go to the mail flow option in the EAC, click on a connector, and edit it.
Next up is your mail database configuration. When you install 2016, a default database is created. You can rename this database and move it from the C: drive to another drive. Open the Exchange Management Shell and run commands to rename and move your database.
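For example, assuming the default database on the new server is being renamed to "DB2016" and moved to the D: drive (the server name, database name, and paths are placeholders):

```powershell
# Rename the default database that setup created on the new server
Get-MailboxDatabase -Server "EX2016" | Set-MailboxDatabase -Name "DB2016"

# Move the database file and its logs; this dismounts the database,
# moves the files, and remounts it when finished
Move-DatabasePath -Identity "DB2016" `
    -EdbFilePath "D:\ExchangeDatabases\DB2016\DB2016.edb" `
    -LogFolderPath "D:\ExchangeDatabases\DB2016\Logs"
```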
Once that’s done, update the OWA directory. Exchange 2016 can act as a proxy for 2010, so both versions can coexist using the same URLs. Now, change the OWA and Autodiscover URLs to point to Exchange 2016, to ensure all traffic goes through Exchange 2016. You can use the script below to do that.
$Server = 'ServerName'
$HTTPS_FQDN = 'mail.yourdomain.com'
Get-OwaVirtualDirectory -Server $Server | Set-OwaVirtualDirectory -ExternalUrl $null
Get-EcpVirtualDirectory -Server $Server | Set-EcpVirtualDirectory -ExternalUrl $null
Get-OabVirtualDirectory -Server $Server | Set-OabVirtualDirectory -ExternalUrl $null
Get-ActiveSyncVirtualDirectory -Server $Server | Set-ActiveSyncVirtualDirectory -ExternalUrl $null
Get-WebServicesVirtualDirectory -Server $Server | Set-WebServicesVirtualDirectory -ExternalUrl $null
Enable-OutlookAnywhere -Server $Server -ClientAuthenticationMethod Basic -SSLOffloading $false -ExternalHostName $HTTPS_FQDN
Lastly, update the DNS so it points Autodiscover and OWA to the new server. To do that, open your Active Directory domain controller, open the DNS Manager, and change the record to ensure that it points to the new server.
Test your configuration
Finally, it’s time to test whether your configuration works. It’s best to create a new user to log in and test account functionality. To create a new user, open the EAC and click on Recipients. From here, add a new user and check that everything is working.
If all is good, migrate all users from the Exchange 2010 to the Exchange 2016 database.
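The mailbox moves can be done from the Exchange Management Shell; a minimal sketch follows, where "DB2010" and "DB2016" are placeholder database names to substitute with your own:

```powershell
# Queue move requests for every mailbox in the old 2010 database,
# targeting the new 2016 database
Get-Mailbox -Database "DB2010" -ResultSize Unlimited |
    New-MoveRequest -TargetDatabase "DB2016"

# Check progress of the moves
Get-MoveRequest | Get-MoveRequestStatistics
```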
In short, much has changed between Exchange 2010 and Exchange 2016, so it’s best you migrate to the latest version to make the most of the new functionalities. Migrating to 2016 is not so difficult when you follow the aforementioned steps.
If you’re anything like me, you’ll share a pet hate for Windows 10 bloatware on brand-new devices. In the “good old days” you would get an image without the crap installed and that would be it, but with Windows Autopilot deployments the bloatware is preinstalled. So how do we deal with this challenge today?
First of all, we need a script that will remove the Windows 10 bloatware. Here’s a script that I have modified to make it a bit smoother for what we are trying to achieve.
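The core of such a script looks like the sketch below. The app list here is purely illustrative; extend it with whatever bloatware ships on your devices, and test carefully before deploying broadly.

```powershell
# Illustrative list of built-in apps to remove; adjust to suit your devices
$Bloatware = @(
    "*Microsoft.BingNews*"
    "*Microsoft.XboxApp*"
    "*Microsoft.SkypeApp*"
    "*king.com.CandyCrushSaga*"
)

foreach ($App in $Bloatware) {
    # Remove the app for all existing user profiles on the device
    Get-AppxPackage -AllUsers -Name $App |
        Remove-AppxPackage -AllUsers -ErrorAction SilentlyContinue

    # Remove the provisioned package so new user profiles don't get it either
    Get-AppxProvisionedPackage -Online |
        Where-Object { $_.DisplayName -like $App } |
        Remove-AppxProvisionedPackage -Online -ErrorAction SilentlyContinue
}
```

Because Remove-AppxProvisionedPackage needs elevation, this script is best run in the system context when deployed through Intune.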
Select Windows 10 (not macOS), then provide the name of the script and a brief description.
Under script location browse to the required PowerShell script on your client device.
Understanding this section
Run this script using the logged on credentials: Select Yes to run the script with the user’s credentials on the device. Choose No (default) to run the script in the system context. Many administrators choose Yes, but if the script needs to run in the system context, choose No.
Enforce script signature check: Select Yes if the script must be signed by a trusted publisher. Select No (default) if there isn’t a requirement for the script to be signed.
Run script in 64-bit PowerShell host: Select Yes to run the script in a 64-bit PowerShell (PS) host on a 64-bit client architecture. Select No (default) to run the script in a 32-bit PowerShell host.
Specify tags if you are utilizing them in your environment, and once you’ve completed that section, select the groups to which you want the script applied.
Review your settings and press Add
This script will now apply to your Windows 10 devices and remove all the unwanted Windows 10 bloatware.
You can read more about it here. I'm proud to have been asked to do this by KEMP. I know there are many out there still running Forefront TMG 2010. Sorry to inform you: it's dead and insecure, and it's time to move forward.
November 24th, 2020 by James "UcMadScientist" Arber
I’ve said for quite a while that smaller meetings are better than larger ones. Taking a line from another speaker, I use the Two-Pizza Rule.
Basically, if a meeting has too many people, or runs too long, for two pizzas to keep everyone fed, it’s too big and should be divvied up into smaller meetings or delegated tasks.
The reason is that the more people attend, the longer the meeting runs so everyone can “get their say.” Meanwhile, all the other resources are sitting there doing nothing, waiting for their opportunity to update two or three other people in the meeting. Not a great use of resources.
The issue with this today
Before COVID, this self-regulated pretty easily. Staff typically needed a good reason for a meeting, as there was only ever a limited amount of meeting rooms.
But with the explosion of remote working, the need for everyone to stay connected, and effectively “free” meetings, staff no longer feel the need to justify the business case for a meeting.
A staff member schedules the meeting, sends the invite to everyone’s email, and attendees turn up because they are expected to.
Users end up getting less work done because meetings are getting larger, longer and more frequent.
For example, I’ve worked on a project where the twice-daily standups would regularly devolve into brainstorming sessions, including things like troubleshooting, or two key attendees discussing a plan between themselves. Meanwhile, the other six attendees are sitting there being unproductive or not paying attention, usually causing them to miss context and ask repeat questions. (Don’t worry, I’m guilty of this too…)
A quick back-of-the-napkin calculation says that’s 12 hours a day wasted! More than a whole project resource!
Meetings have their place, but we now have better tools for collaboration.
Typically, your staff organize meetings to get updates on tasks, hear how problems are being solved, and discuss potential solutions.
So why not find more effective ways to keep everyone connected? Whilst still allowing users to stay in their workflow and update things in their own time?
Use Teams Planner to assign work and view Status
Teams has supported Tasks in Planner for quite some time now, and it’s a great way to assign, track, and report on all manner of tasks.
Assuming you have a Team Channel for the project in question, create a new Planner Tab and populate it with Tasks.
This posts a message to the Team about the new tab and lets everyone know where to find it.
With a quick glance, everyone in the team can see where tasks are, who they are assigned to and any blockers.
Creating tasks is simple, fill in some details and click Add Task.
Need context? Soon you will even be able to create tasks from messages in the chat.
More complicated than that? Editing a task is as simple as clicking it, updating the relevant fields, and closing it.
These tasks can be a simple to-do list or a full list with deliverable dates and reports. And no special licenses or training are needed for your users to set it up.
Here’s an example from a real-world project.
Did I mention that the tasks assigned to you appear in Microsoft To Do? So you can self-organise and update tasks without even opening Teams!
Now your Team can use Planner to get a high-level view of what’s being completed, what’s getting worked on today, and what’s outstanding.
Meeting Averted or Shortened.
Use threaded conversations to keep topics organised, and use the pinned post for quick updates
Another thing we are all used to is email conversations. Teams threads can offer a much better live experience without needing to start a meeting to discuss outcomes.
Users may mention something in the meeting, but instead of working it out live then and there, consider moving it to a thread and save valuable time.
Threads let everyone collaborate and update in their own time, without keeping everyone waiting for an update
The trick to this is making sure the owner keeps the top post updated with the current status, letting other people in the team see the latest without having to read the entire thread.
Kick off a new Conversation and switch to rich format mode
Once that’s done, fill in the top post of the thread with the topic overview and the goals you’re aiming to achieve.
Then underneath, start the organic conversation to get the ball rolling: post links to references, and even mention users in the thread to bring them into the discussion.
Eventually, the thread will get long enough to be truncated. Just edit the top post to include information for those that just want a quick update without reading the entire thread.
Now everyone can easily see the latest without having to read the entire thread or off-topic posts.
Another Meeting Averted or Shortened
Keep meetings short, small, and infrequent. Things like town halls are fine, but if you’re reaching for meetings as your go-to solution, take a step back and ask what you are actually trying to achieve.
Check to see if there is a tool in Teams that can help you achieve your goal like tracking progress more effectively.
With Lists and Project coming to Teams shortly, more and more management can be done without the need for a meeting. Users can work at their own pace, update things without disrupting their flow and be flexible in their remote work.
I also used this comprehensive guide by Vlad Catrinescu, which includes links to Pluralsight training courses and MS Docs articles for every topic covered on the test. I cannot recommend this study guide, and the Pluralsight courses that go along with it, enough.
Prior to taking this exam, I had been a Teams admin for a few years. In addition, my company just completed a Teams Only rollout that we'd been working on for the past year, which included cloud voice, policies, guest access, and many other topics that were covered on the test. As usual, actual experience is one of the best teaching tools you can have prior to taking an exam.
My exam contained 47 questions, and I was given 210 minutes to complete it. I needed 45 minutes and scored a 955.