Sunday, November 12, 2023

Blazor WebAssembly Lazy Loading Changes from .NET 7 to .NET 8

Lazy loading is an essential technique in web client development: it defers loading resources until the user requests them, as opposed to paying the cost of loading everything up front. Blazor has been able to lazy load Razor Class Libraries for the last several versions of .NET, but there are some updates in .NET 8 that aren't well documented.

To avoid being completely repetitive, the basic steps for implementing lazy loading in your Blazor WebAssembly application are in the Microsoft docs: Lazy load assemblies in ASP.NET Core Blazor WebAssembly

The issue is that the Project Configuration and Router Configuration sections of the docs, as of this post, are still not up to date. With .NET 8, the WASM assemblies are now built as .wasm files rather than .dll files, so the main updates you'll need to make are inside the .csproj file and within the LazyAssemblyLoader routing code, referencing the assemblies with the .wasm file extension:

.csproj file updates
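Per the docs, the lazy-loaded assembly is declared in an ItemGroup; the .NET 8 change is that the Include now uses the .wasm extension. A sketch, using the library name from the error shown further below:

    <ItemGroup>
      <BlazorWebAssemblyLazyLoad Include="EngineAnalyticsWebApp.TestLazy.wasm" />
    </ItemGroup>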


Routing file updates
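Below is a sketch of the App.razor Router configuration following the docs' pattern, updated for .NET 8. The 'test-lazy' route path is an illustrative placeholder for whatever route should trigger the load, and the usual _Imports defaults are assumed:

    @using System.Reflection
    @using Microsoft.AspNetCore.Components.Routing
    @using Microsoft.AspNetCore.Components.WebAssembly.Services
    @inject LazyAssemblyLoader AssemblyLoader

    <Router AppAssembly="@typeof(App).Assembly"
            AdditionalAssemblies="@lazyLoadedAssemblies"
            OnNavigateAsync="@OnNavigateAsync">
        <Found Context="routeData">
            <RouteView RouteData="routeData" DefaultLayout="typeof(MainLayout)" />
        </Found>
    </Router>

    @code {
        private List<Assembly> lazyLoadedAssemblies = new();

        private async Task OnNavigateAsync(NavigationContext args)
        {
            try
            {
                if (args.Path == "test-lazy")
                {
                    // The .NET 8 change: reference the .wasm file, not the .dll
                    var assemblies = await AssemblyLoader.LoadAssembliesAsync(
                        new[] { "EngineAnalyticsWebApp.TestLazy.wasm" });
                    lazyLoadedAssemblies.AddRange(assemblies);
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }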


If you continue to use the old code with the .dll extension, you'll get the following error when building your application:
Unable to find 'EngineAnalyticsWebApp.TestLazy.dll' to be lazy loaded later. Confirm that project or package references are included and the reference is used in the project.
Once you've made the required updates to your .NET 8 application to prevent the error above, your app should build successfully, and you'll see the correct deferred loading in the browser.



Friday, November 3, 2023

Dealing with Time Skew and SAS Azure Blob Token Generation: 403 Failed to Authenticate Error

If you're working with the Azure.Storage.Blobs SDK in .NET and generating SAS tokens, you might come across an error like the following when making a call to GetUserDelegationKeyAsync():
Status 403: The request is not authorized to perform this operation using this permission

Here is a sample of code that might throw this error in which the startsOn value is set to the current time:
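Something along these lines, using the Azure.Storage.Blobs SDK (the account URI and credential setup are illustrative):

    using Azure.Identity;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    var blobServiceClient = new BlobServiceClient(
        new Uri("https://mystorageaccount.blob.core.windows.net"),
        new DefaultAzureCredential());

    // Problem: startsOn is the current time, which is subject to clock skew
    UserDelegationKey userDelegationKey = await blobServiceClient.GetUserDelegationKeyAsync(
        startsOn: DateTimeOffset.UtcNow,
        expiresOn: DateTimeOffset.UtcNow.AddHours(1));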

Before you start double-checking all of the user permissions in Azure and going through all the Access Control and Role Assignments in Blob Storage (assuming you do have them configured correctly), know that this error can be a red herring for a different issue: the startsOn value passed to GetUserDelegationKeyAsync(). The problem is likely clock skew, the slight difference in current time between the server and the client.

The SAS best practices section of the Azure docs (found here) documents this clock skew issue and how to remedy it:

Be careful with SAS start time. If you set the start time for a SAS to the current time, failures might occur intermittently for the first few minutes. This is due to different machines having slightly different current times (known as clock skew). In general, set the start time to be at least 15 minutes in the past. Or, don't set it at all, which will make it valid immediately in all cases. The same generally applies to expiry time as well--remember that you may observe up to 15 minutes of clock skew in either direction on any request. For clients using a REST version prior to 2012-02-12, the maximum duration for a SAS that does not reference a stored access policy is 1 hour. Any policies that specify a longer term than 1 hour will fail.

The fix is simple: as instructed, either leave startsOn unset or set it to 15 minutes (or more) in the past. With the updated code below, the 403 error is gone and the call proceeds as expected.
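For example, backdating startsOn by 15 minutes in the sketch from above:

    // Fix: backdate startsOn to absorb clock skew (or pass null to make it valid immediately)
    UserDelegationKey userDelegationKey = await blobServiceClient.GetUserDelegationKeyAsync(
        startsOn: DateTimeOffset.UtcNow.AddMinutes(-15),
        expiresOn: DateTimeOffset.UtcNow.AddHours(1));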

Wednesday, October 18, 2023

Fixing PowerShell Scripting Error in C# Code: Exception calling "ToString" with "0" argument(s): "There is no Runspace available to run scripts in this thread."

A bit of a niche post here, but maybe someone searching the interwebs will find this and save some time. When running a PowerShell script in C# via the System.Management.Automation library, you might run into some nuanced issues that behave differently than when running PowerShell commands directly via the command line. Here is one instance: take the following PowerShell command to install a new Windows Service:
New-Service -Name "MyWindowsService" -BinaryPathName "C:\SomePath\1.0.1\MyWindowsService.exe"
When running this in a PowerShell terminal, it generates the following output, which, as documented, is an object representing the new service:
Status   Name               DisplayName
------   ----               -----------
Stopped  MyWindowsService   MyWindowsService
This is actually helpful output for a human, and maybe even for logging, but when running this same command in C# via a PowerShell instance, the command will technically work, yet you'll get the following cryptic error:
"The following exception occurred while retrieving the string: ""Exception calling ""ToString"" with ""0"" argument(s): ""There is no Runspace available to run scripts in this thread. You can provide one in the DefaultRunspace property of the System.Management.Automation.Runspaces.Runspace type. The script block you attempted to invoke was: $this.ServiceName""""", System.Object,System.ServiceProcess.ServiceController,System.WeakReference`1[System.Management.Automation.Runspaces.TypeTable], System.Management.Automation.PSObject+AdapterSet,None,System.Management.Automation.DotNetAdapter,"",System.Management.Automation.PSObject+AdapterSet, System.ServiceProcess.ServiceController,System.ServiceProcess.ServiceController,"PSStandardMembers {DefaultDisplayPropertySet}",False,False,False,False,False, "","",None
OK, not very helpful. It took a while, but I deduced that the output from the New-Service cmdlet was causing the issue. I found two ways to handle it.
  1. Suppress the Output: Since this is running in C# without human intervention, pipe the output to Out-Null if you are not logging it or using it for some meaningful purpose. Remember, you can still get the status by using Get-Service to inspect the newly installed service, rather than reading the output. Use the updated command to accomplish this:
    New-Service -Name "MyWindowsService" `
    -BinaryPathName "C:\SomePath\1.0.1\MyWindowsService.exe" | Out-Null
    
  2. Return the Output to a Variable: The other option is to assign the cmdlet's output to a variable, typically to inspect it later. This could be done with the following updated statement:
    $newServiceResult = New-Service -Name "MyWindowsService" `
    -BinaryPathName "C:\SomePath\1.0.1\MyWindowsService.exe"
    
    Keep in mind the return value is of type System.ServiceProcess.ServiceController, so you can use it as needed.
For my needs I prefer option #1, as I don't need to inspect the return value of New-Service and, as mentioned, can use Get-Service to further inspect any Windows Service's status at this point. With either method, though, you will no longer get the cryptic error shown previously.
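For reference, here's a minimal sketch of what the C# side of this might look like with System.Management.Automation, using the Out-Null variant from option #1:

    using System.Management.Automation; // e.g. via the Microsoft.PowerShell.SDK NuGet package

    using var ps = PowerShell.Create();
    ps.AddScript(
        "New-Service -Name \"MyWindowsService\" " +
        "-BinaryPathName \"C:\\SomePath\\1.0.1\\MyWindowsService.exe\" | Out-Null");

    // With the output suppressed, the ToString/Runspace error no longer surfaces
    var results = await ps.InvokeAsync();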

Thursday, September 21, 2023

How to Disable Either Classic or YAML Pipelines in Azure DevOps (ADO)

If you have pipelines in Azure DevOps that you don't want to trigger automatically, whether because they are legacy and remain for reference, are not currently being used, or just need to be temporarily disabled, there are different ways to accomplish this based on the type of pipeline.

Classic Pipelines

  1. Select the Classic Pipeline and press the 'Edit' button


  2. Select 'Triggers'


  3. Uncheck the box for 'Enable Continuous Integration'


  4. Save the Pipeline



YAML Pipelines

  1. Select the YAML Pipeline and press the 'Edit' button


  2. Select the ellipsis (3 dots) in the top-right hand corner, and select 'Settings'


  3. Select the 'Disabled' radio button, and press 'Save'


  4. The pipeline is confirmed disabled via the label next to the name




Wednesday, September 20, 2023

Leveraging IntelliSense for Azure DevOps YAML Pipelines in Visual Studio Code

If you're building out YAML pipelines for Azure DevOps (ADO) and don't wish to hand-roll them in the online editor provided, you can instead build them in Visual Studio Code. The first thing you'll hunt for, though, is IntelliSense and auto-complete assistance for your .yml pipeline code. Out of the box, VS Code doesn't offer this beyond its generic YAML language services, so you'll want an extension. There is one for exactly this, simply named 'Azure Pipelines,' which can be found in the Marketplace.


The catch is that the Microsoft extension for ADO pipelines is poorly rated at 2 out of 5 stars, which made me wonder if it was worth using. The good news is that the rating doesn't reflect a completely orphaned product, as the latest commit was within a week of this writing. The main critique, which I verified, is that after installation it just doesn't seem to do anything. However, with a single step I got the primary functionality working as desired and have the IntelliSense and auto-complete features I needed.

The key is to manually switch the 'Language Mode' setting in the bottom right-hand corner to the proper value for the specific code file being created (or use the keyboard shortcut Ctrl + K M). Initially it will show the auto-detected 'YAML,' and in this mode none of the extension's features work.


Click the dropdown and you'll get a selection of languages that can be set. Select 'Azure Pipelines.'


Now the .yml file you're working on will have the IntelliSense, auto-complete, and hinting features of the installed extension, which is a real help in creating pipelines.
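As a side note, rather than switching the Language Mode by hand for each file, you should be able to persist the mapping with a file association in your VS Code settings.json; something like the following, where the glob patterns are whatever match your pipeline files:

    "files.associations": {
        "azure-pipelines.yml": "azure-pipelines",
        "**/pipelines/*.yml": "azure-pipelines"
    }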


Friday, July 28, 2023

Reading Environment Variables, AppSettings, LocalSettings, and User Secrets Seamlessly Across Environments in ASP.NET

One of the biggest challenges we face in modern cloud solutions is making it so sensitive configuration data can be read seamlessly across all environments, from the local dev environment all the way to production. This isn't really a new issue, but it's one with multiple possible solutions, some easier than others.


For a sample use case, we might have a database connection string with sensitive information that needs to work seamlessly in code for local, Dev, QA, Stage, Prod, etc. without a lot of special hooks or handling. One way to handle this in the cloud, regardless of deployment resource or setup, is to expose a single key name with a unique value per environment. This can be configured via scripting in deployment pipelines, which takes care of the DevOps side of the equation. In code, however, we want a single read of that key to work in all environments. This is where the configuration provider offered in .NET's Microsoft.Extensions.Configuration works so well.

Let's say the following environment variable has been configured across Dev, QA, Stage, and Prod environments with respective values already:
CosmosDbConnectionString | AccountEndpoint=https://cosmos-acct-{env}-eus-001.documents.azure.com:443/;AccountKey=abc123...
For local development we have a couple options as well to configure this value:
  1. Create your own environment variable on your local OS with the identical key name and whatever value you choose
  2. Add the identical key and again whatever value you require as a 'User Secret' within Visual Studio or VS Code (VS Code requires an extension download to work with secrets)
The above two methods are preferred because they do not add to files that get committed to source control. While you can add the key/value pairs to appSettings.json or launchSettings.json (as env variables), this is not advised, as it breaks the rule that we should not commit secrets to source control. Note: for bonus automation, you could also add a PowerShell script to the solution that developers run once to pull the environment variable key names from the cloud and set them with local values.

So moving back to our .NET code, we would like a single line that reads 'CosmosDbConnectionString' and works in all environments.

To get started, in current versions of ASP.NET, this single line of code in program.cs sets us up out of the box:
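Per the standard ASP.NET Core template, that's the builder call, which wires up the default configuration providers (appsettings.json, user secrets, environment variables, and more):

    var builder = WebApplication.CreateBuilder(args);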


The absolutely fantastic feature of this black box is that it automagically handles the hierarchy of running through multiple different configuration sources, including environment variables! This single line of code (after injecting the IConfiguration configuration service) is all that's needed:
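With the injected service held in a configuration field, the read might look like this:

    var cosmosDbConnectionString = configuration["CosmosDbConnectionString"];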


Here is the official documentation on this from Microsoft:
Using the default configuration, the EnvironmentVariablesConfigurationProvider loads configuration from environment variable key-value pairs after reading appsettings.json, appsettings.Environment.json, and Secret manager. Therefore, key values read from the environment override values read from appsettings.json, appsettings.Environment.json, and Secret manager.
This is great, because now we can add our connection string as a local secret (or local environment variable) and have it read at runtime during debugging as required, but once deployed the configured environment variables per-environment will supersede anything from appSettings, secrets, etc. and continue to work with the same code.


This is also the case for non-sensitive data stored in appSettings.json locally vs. configuration in a deployed environment like Azure. Any per-environment configuration setting in Azure using the same key name will supersede locally configured values. This use case is better known, but worth mentioning, as it's an anti-pattern to manage different environments' configs at the source control level; they should be managed in the deployed environments via DevOps.

A last point to review is naming conventions for environment variables so they work across platforms. Per the same MSFT docs linked above:
The : delimiter doesn't work with environment variable hierarchical keys on all platforms. For example, the : delimiter is not supported by Bash. The double underscore (__), which is supported on all platforms, automatically replaces any : delimiters in environment variables.
Therefore, if you need a hierarchical key, consider the following naming structure, which works on both Windows and Linux:
CosmosDb__ConnectionString
This can be read regardless of deployed environment in code with the following line:
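Using the same injected configuration as before, with the double underscore surfacing as the : delimiter:

    // The __ in the environment variable name maps to : when read
    var cosmosDbConnectionString = configuration["CosmosDb:ConnectionString"];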


Thanks to all this functionality included in the .NET configuration provider, we can work across 1..n environments using a multitude of configuration options, yet still garner that data at runtime with a single line of code!

Thursday, June 8, 2023

How to Reorder the Profiles in the Windows Command Prompt / Terminal

The Windows Command Prompt / Terminal tool has gotten more robust over the years and makes using a tool like ConEmu (which I used for years) less of a pressing need. That's because tabbed profiles are available, making it a breeze to open new tabs for different profile types (Bash, PowerShell, Azure Cloud Shell). However, there isn't a setting for reordering them, for those of us who like to organize this sort of thing (i.e., putting the profiles I use most at the top of the list). There is a feature enhancement on GitHub (Profile reordering for Settings UI), but for now there's a trivial way to modify the order today. Here's a picture of the profiles I have set up currently:
I want to reorder these, and the easiest way to do so is to open 'Settings' from the profile dropdown shown above and select 'Open JSON File' from the bottom left-hand corner of the dialog:
Now it's simply a matter of rearranging the elements of the array collection within the settings file. An entry's position in the array dictates the order of the profiles in the terminal window:
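Here's a trimmed illustration of that section of settings.json (each real entry also carries its guid, source, and other properties; the profile names here are examples):

    "profiles":
    {
        "list":
        [
            // The dropdown shows profiles in this array order -- rearrange to taste
            { "name": "PowerShell" },
            { "name": "Git Bash" },
            { "name": "Azure Cloud Shell" }
        ]
    }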
Once the changes are complete, save the file, restart the command prompt, and you'll see the new order of the profiles!

Monday, February 13, 2023

The State of JavaScript and Modern Web Client Development

Here is a link to a white paper I wrote, titled 'The State of JavaScript and Modern Web Client Development.' If you need help navigating the modern web landscape at a high level, make sure to check out the article.