PowerShell Universal has evolved immensely over the years. At first, we exposed REST APIs for all our cmdlets. Each cmdlet would simply call a REST API and return data. Then, we had users who wished to call the cmdlets without having to provision app tokens. This led to cmdlets that could run over our internal gRPC channel as well as directly via our integrated environment.
This led to a lot of complexity. We had to maintain four different ways to call the cmdlets:

- Direct, internal calls within the integrated environment
- HTTP calls to the REST API
- gRPC calls over the internal channel
- Direct database access during configuration
As you can imagine, this led to bugs and inconsistencies. In PowerShell Universal v5, we’ve focused on unifying our cmdlets to utilize a single mechanism for remote calls. The configuration system remains the same but only returns instances of our POCO objects so the complexity is minimal.
HTTP and the configuration system were the first ways our cmdlets worked. If the cmdlet was called outside of PSU, it would call a REST API using Flurl. We had a base class that implemented a lot of the boilerplate code but it was still a lot of code to maintain. The fluent APIs were nice but still required work in the module and on the server.
Here's an example of how the Get-PSUAppToken cmdlet would return tokens by a specific identity.
public async ValueTask<IEnumerable<AppToken>> GetAppTokensByIdentity(Identity identity)
{
    // Build /api/v1/identity/{id}/apptoken with Flurl, optionally attach the app token,
    // and deserialize the JSON response.
    return await _url.AppendPathSegments("api", "v1", "identity", identity.Id.ToString(), "apptoken")
        .MaybeWithOAuthBearerToken(_appToken, _useDefaultCredentials)
        .GetAsync()
        .ReceiveJson<IEnumerable<AppToken>>();
}
We had some abstraction in place so the cmdlet itself would just call the base class to return the data. The IClient interface was exposed as a property on the base class and the implementation was one of the following:

- Client service class (direct, internal call)
- HttpClient class (HTTP call)
- GrpcClient class (gRPC call)

In some circumstances, we even used the IDatabase class during configuration to interact directly with the configuration database.
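To give a sense of how bloated that surface became, here's a hypothetical, simplified sketch of the IClient shape. The member names are taken from the cmdlet examples in this post; the real interface wrapped results in response types and covered every resource in the product.

// Hypothetical, simplified sketch of the v4 IClient surface.
public interface IClient
{
    ValueTask<IEnumerable<AppToken>> GetAppTokens();
    ValueTask<IEnumerable<AppToken>> GetAppTokensByIdentity(Identity identity);
    ValueTask<AppToken> GetAppToken(long id);
    ValueTask<Identity> GetIdentityByName(string name);
    // ...and many more members, one set per resource type
}

// Each transport needed its own implementation of that same surface:
//   Client     - direct, in-process calls
//   HttpClient - REST calls via Flurl
//   GrpcClient - calls over the internal gRPC channel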
The cmdlets themselves were pretty straightforward, but it still required a lot of IClient implementations to handle the different ways the cmdlets could be called. Here's an example of the Get-PSUAppToken cmdlet.
if (ParameterSetName == "All")
{
    WriteObject(Client.GetAppTokens().ToResult(), true);
}
else if (ParameterSetName == "Identity")
{
    if (Identity.Id == 0)
    {
        Identity = Client.GetIdentityByName(Identity.Name).ToResult();
    }

    WriteObject(Client.GetAppTokensByIdentity(Identity).ToResult(), true);
}
else
{
    WriteObject(Client.GetAppToken(Id).ToResult(), true);
}
On the server side of things, there were also multiple implementations. If you called the cmdlet with -Integrated or within the integrated environment, it would end up calling the Client class either over gRPC or directly.
public async ValueTask<Response<Identity>> GetIdentityByName(StringRequest name)
{
    // The lookup itself is synchronous; the await simply satisfies the async signature.
    await Task.CompletedTask;
    return new Response<Identity>(_database.Identities.Where(m => m.Name == name.Value).FirstOrDefault());
}
But, if you called it with an app token, it would run over HTTP and hit the ASP.NET controller. This had an entirely different implementation.
[HttpGet]
[Authorize(Roles = "Administrator,Operator,Reader,Execute")]
[Route("{id:long}/apptoken")]
public IEnumerable<AppToken> GetAppTokens([FromRoute] long id)
{
    return _database.AppTokens.Where(x => x.Identity.Id == id && x.Identity.Name != "System").ToArray();
}
Not only was the logic different, but the authentication and authorization mechanisms were different as well.
In v5, we've eliminated the HTTP and integrated environment calls. All cmdlets now run over gRPC. We've exposed the gRPC services directly via ASP.NET Core's new hosting mechanism for these types of services. We've also built out client/service classes for each type of resource, similar to the controllers that we had previously. This means we have a single class that we can call over gRPC, use internally with our service provider, and call over HTTP for our REST API.
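As a rough sketch of what that hosting can look like, assuming protobuf-net.Grpc.AspNetCore for code-first gRPC and Grpc.AspNetCore.Web for gRPC Web (which matches the client code shown later; the exact PSU startup isn't shown in this post):

var builder = WebApplication.CreateBuilder(args);

// Code-first gRPC hosting plus the regular MVC controllers.
builder.Services.AddCodeFirstGrpc();
builder.Services.AddControllers();

// Register every [Service]-decorated resource service (shown later in this post).
builder.Services.AddPlatformServices();

var app = builder.Build();

// gRPC Web lets the same services be reached over HTTP/1.1 and from the browser.
app.UseGrpcWeb(new GrpcWebOptions { DefaultEnabled = true });

app.MapControllers();
app.AddGrpcServices(); // maps every [Service(grpc: true)] class (shown later in this post)

app.Run();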
The cmdlets are a little different. Rather than use a bloated IClient interface, each cmdlet retrieves one or more services from a gRPC base cmdlet.
var client = GetService<IAppTokenClient>();
var identityClient = GetService<IIdentityClient>();
When it calls the client service, it does so with methods that appear just as they would if they were calling the service directly.
if (ParameterSetName == "All")
{
    WriteObject(client.GetAllAsync().ToResult(), true);
}
else if (ParameterSetName == "Identity")
{
    if (Identity.Id == 0)
    {
        Identity = identityClient.GetAsync(Identity.Name).ToResult();
    }

    WriteObject(client.GetByIdentityAsync(Identity.Id).ToResult(), true);
}
else
{
    WriteObject(client.GetAsync(Id).ToResult(), true);
}
Under the hood, this is typically calling a gRPC service. We use protobuf-net to generate proxy services on the fly.
var binderConfiguration = BinderConfiguration.Create();
var clientFactory = ClientFactory.Create(binderConfiguration);
var channel = GrpcChannel.ForAddress(computerName, options);
return channel.CreateGrpcService<T>(clientFactory);
We are also dropping down to gRPC Web to support Windows authentication and HTTP/1.1. We'll be making some tweaks in the future to enable HTTP/2, which will increase performance but won't work on every platform. Using gRPC Web on the server also allows our services to be called from the browser.
var handler = new GrpcWebHandler(GrpcWebMode.GrpcWeb, new HttpClientHandler
{
    UseDefaultCredentials = UseDefaultCredentials
})
{
    HttpVersion = new Version(1, 1),
};
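That handler is then attached to the channel the proxies are created from. A minimal sketch using the standard Grpc.Net.Client options (the exact PSU wiring may differ):

// Route the channel through the gRPC Web handler so calls work over HTTP/1.1.
var channel = GrpcChannel.ForAddress(computerName, new GrpcChannelOptions
{
    HttpHandler = handler
});

var appTokenClient = channel.CreateGrpcService<IAppTokenClient>(clientFactory);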
While the per-cmdlet code savings were large in the module, the savings in the server implementation were even greater. We unified our services, controllers, and configuration system into a single resource service used for all of these purposes.
Each service is now registered with a custom ServiceAttribute to avoid having to add individual services to the startup.cs file. Additionally, we use the ControllerAttribute and TagsAttribute to expose the service as a REST API. The controller attribute adds it to the REST API and the tags are used for Swagger documentation.
[Service(typeof(IComputerClient), grpc: true)]
[Controller]
[Tags("Computer")]
public class ComputerClient(IDatabase database) : IComputerClient
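The attribute itself doesn't need much. Here's a hypothetical sketch that lines up with how it's consumed by the registration code below; the InterfaceType, Grpc, and Lifetime members are taken from that code, and the default lifetime is an assumption.

// Hypothetical sketch of the registration attribute consumed by AddPlatformServices/AddGrpcServices.
[AttributeUsage(AttributeTargets.Class)]
public class ServiceAttribute : Attribute
{
    public ServiceAttribute(Type interfaceType, bool grpc = false, ServiceLifetime lifetime = ServiceLifetime.Singleton)
    {
        InterfaceType = interfaceType;
        Grpc = grpc;
        Lifetime = lifetime;
    }

    public Type InterfaceType { get; }
    public bool Grpc { get; }
    public ServiceLifetime Lifetime { get; }
}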
We have some simple reflection to find and load these services at startup. Services can still define their lifetimes and interfaces. ASP.NET Core does something similar with the ControllerAttribute and TagsAttribute, so we don't need to manage that ourselves.
public static void AddPlatformServices(this IServiceCollection serviceCollection)
{
    typeof(PlatformServices).Assembly.GetTypes()
        .Where(p => p.GetCustomAttributes(typeof(ServiceAttribute), false).Length != 0)
        .ToList()
        .ForEach(p =>
        {
            var attribute = (ServiceAttribute)p.GetCustomAttributes(typeof(ServiceAttribute), false).First();
            serviceCollection.Add(new ServiceDescriptor(attribute.InterfaceType, p, attribute.Lifetime));
        });
}
If a service is defined as a gRPC exposed service, it will also be mapped as a gRPC service.
private static readonly MethodInfo MapGrpcService = typeof(GrpcEndpointRouteBuilderExtensions).GetMethod("MapGrpcService");

public static void AddGrpcServices(this IEndpointRouteBuilder builder)
{
    typeof(PlatformServices).Assembly.GetTypes()
        .Where(p => p.GetCustomAttributes(typeof(ServiceAttribute), false).Length != 0)
        .ToList()
        .ForEach(p =>
        {
            var attribute = (ServiceAttribute)p.GetCustomAttributes(typeof(ServiceAttribute), false).First();
            if (attribute.Grpc)
            {
                MapGrpcService.MakeGenericMethod(p).Invoke(null, [builder]);
            }
        });
}
Each method within the service is decorated with several attributes that control access to the service, provide an HTTP route and method, and enhance the Swagger documentation. The PermissionAttribute is used to control access to the service and is a custom AuthorizeAttribute that we've implemented for fine-grained permissions. It is used for both HTTP and gRPC calls. The HttpGetAttribute is used to expose the service as an HTTP endpoint and provide the route to call the service. The ProducesResponseTypeAttribute is used to provide Swagger documentation for the service. It's optional but provides great control over what shows up in Swagger.
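The actual PermissionAttribute isn't shown in this post, but a fine-grained permission attribute built on AuthorizeAttribute can be as small as encoding the resource and access level into a policy name that a custom policy provider resolves; a hypothetical sketch:

// Hypothetical sketch: encode the resource/access pair into a policy name that a
// custom IAuthorizationPolicyProvider can resolve for both HTTP and gRPC calls.
public class PermissionAttribute : AuthorizeAttribute
{
    public PermissionAttribute(Resource resource, Access access)
    {
        Resource = resource;
        Access = access;
        Policy = $"{resource}:{access}";
    }

    public Resource Resource { get; }
    public Access Access { get; }
}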
The result is a very minimal implementation of the logic, but a lot of declarative code that describes the service and controls how it should be accessed.
/// <summary>
/// Returns all computers.
/// </summary>
/// <returns></returns>
[Permission(Resource.PlatformComputers, Access.Read)]
[HttpGet("/api/v1/computer")]
[ProducesResponseType(typeof(IEnumerable<Computer>), 200)]
public async ValueTask<IEnumerable<Computer>> GetAllAsync()
{
    return await database.Computers.GetAsync();
}
Internally, we can use the services as we would any other service. When using a service in this manner, it doesn't have to go through authentication or authorization.
var service = serviceProvider.GetRequiredService<IComputerClient>();
var computers = await service.GetAllAsync();
If a service is used in the configuration system, for resources like endpoints, it will implement the IModelService<T> interface. This interface is used to interact with the configuration system. It is a generic CRUD interface that is called when reading and writing configuration files.
[ServiceContract]
public interface IModelService<T> where T : IModel
{
    ValueTask<T> CreateAsync(ModelOperationContext<T> context);
    ValueTask DeleteAsync(ModelOperationContext<T> context);
    ValueTask<T> UpdateAsync(ModelOperationContext<T> context);
    ValueTask<T> Get(LongRequest id);
    ValueTask<IEnumerable<T>> GetAll();
}
For example, if an endpoint is deleted, the DeleteAsync method is called. It is responsible for deleting the endpoint from the database, the file system, and any other resources that are associated with the endpoint.
public async ValueTask DeleteAsync(ModelOperationContext<Endpoint> context)
{
    var endpoint = context.Item;

    await apiService.RemoveEndpointAsync(endpoint);
    await database.DeleteAsync(endpoint);

    if (endpoint.Path.IsNotNullOrEmpty())
    {
        var fullPath = pathService.GetEndpointPath(endpoint);

        if (File.Exists(fullPath))
            File.Delete(fullPath);
    }

    if (context.Serialize)
        await serializer.SerializeToFileAsync<Endpoint>();

    if (endpoint.Documentation != null)
        docService.CacheDocs();
}
During startup of PowerShell Universal, the model services are used to perform the initialization of all the resources. The same configuration system is used to read the .ps1 configuration files in the .universal folder, but it now passes the configuration data to the services to create the resources.
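Conceptually, that startup pass looks something like the sketch below. The ParseConfigurationFile helper and the object-initializer shape of ModelOperationContext<T> are assumptions for illustration; only the Item and Serialize members appear earlier in this post.

// Simplified sketch of configuration startup: parse a .ps1 file from the .universal
// folder into POCOs and hand each one to the matching model service.
var endpointService = serviceProvider.GetRequiredService<IModelService<Endpoint>>();

foreach (var endpoint in ParseConfigurationFile<Endpoint>(".universal/endpoints.ps1"))
{
    await endpointService.CreateAsync(new ModelOperationContext<Endpoint>
    {
        Item = endpoint,
        Serialize = false // the data just came from disk; don't write it back out
    });
}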
We've saved thousands of lines of code by changing how our cmdlets and services work. Reducing the size of the code base makes it easier to implement and test services. It also reduces the risk of bugs and inconsistencies and improves our overall security posture. While we haven't performed extensive performance testing, we anticipate that the gRPC services will be faster than the HTTP services in HTTP/2 mode. The current, default gRPC Web implementation is likely to have similar performance to the previous HTTP implementation as it uses a similar transport.
While lines of code isn't the greatest metric for code quality, we'll be doing a little comparison of the line counts between v4 and v5. We anticipate that, between the move to Blazor and the unification of services, we'll have reduced the code base significantly and even added features along the way.