This is the third post in my "Push Notifications and ASP.NET Core" series. Previously I've written about the Push API and the Web Push Protocol. Since the last post I've extracted the push service client into a separate project and added a few more features to it (for example the capability of caching VAPID tokens). In this post I want to show two of those features, which correspond to not so well-known capabilities of push messages: replacing and urgency.

Replacing Messages

There is a way to correlate push messages sent to the same push service. In order to do this, a topic property needs to be added to the push message.

public class PushMessage
{
    ...

    public string Topic { get; set; }

    ...
}

The topic should be a string with a maximum length of 32 characters and should contain only characters from the "URL and Filename safe" Base64 alphabet. It can be delivered to the push service by using the Topic request header.

public class PushServiceClient
{
    ...

    private const string TOPIC_HEADER_NAME = "Topic";

    ...

    private static HttpRequestMessage SetTopic(HttpRequestMessage pushMessageDeliveryRequest,
        PushMessage message)
    {
        if (!String.IsNullOrWhiteSpace(message.Topic))
        {
            pushMessageDeliveryRequest.Headers.Add(TOPIC_HEADER_NAME, message.Topic);
        }

        return pushMessageDeliveryRequest;
    }

    ...
}
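
The client shown above only forwards the topic; it doesn't validate it. A minimal validation sketch for the constraints mentioned earlier (a hypothetical helper, not part of the actual client) could look like this.

private static readonly Regex _topicRegex = new Regex("^[a-zA-Z0-9_-]{1,32}$", RegexOptions.Compiled);

private static void ValidateTopic(string topic)
{
    // The topic is optional, but when present it must be at most 32 characters
    // from the "URL and Filename safe" Base64 alphabet (A-Z, a-z, 0-9, '-', '_').
    if (!String.IsNullOrEmpty(topic) && !_topicRegex.IsMatch(topic))
    {
        throw new ArgumentException("Invalid push message topic.", nameof(topic));
    }
}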

When a push service receives a message with a topic, it goes through all undelivered messages for the related subscription and checks if there is one with an identical topic. If such a message exists, it will be replaced (which means replacing its content and attributes like Time-To-Live). So, if a client has been offline it will receive only the latest version of the message (a client which has been online the whole time will receive all versions). This way delivering unnecessary (outdated) messages can be avoided.

Urgency

Another message property which impacts how a push service delivers messages is urgency.

public class PushMessage
{
    ...

    public PushMessageUrgency Urgency { get; set; }

    ...

    public PushMessage(string content)
    {
        ...
        Urgency = PushMessageUrgency.Normal;
    }
}

Urgency serves as a filter. A client can let the push service know the lowest urgency of messages it wants to receive. The typical scenario here is limiting resource consumption, which is why the four currently defined levels have a suggested relation to the power and network state of the device (a short usage example follows the list):

  • very-low - On power and Wi-Fi
  • low - On power or Wi-Fi
  • normal - On neither power nor Wi-Fi
  • high - Low battery
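
On the application side, choosing the topic and urgency is just a matter of setting the corresponding properties on the message before requesting delivery. A quick usage sketch based on the PushMessage class shown above (the content and topic values are illustrative):

PushMessage message = new PushMessage("New articles are waiting for you")
{
    // Messages with the same topic replace each other while undelivered.
    Topic = "new-articles",
    // Low urgency: the push service may delay delivery to save the device's resources.
    Urgency = PushMessageUrgency.Low
};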

Like the topic, the urgency can be delivered by using a dedicated request header. The name of the header is also easy to guess.

public class PushServiceClient
{
    ...

    private const string URGENCY_HEADER_NAME = "Urgency";

    ...

    private static readonly Dictionary<PushMessageUrgency, string> _urgencyHeaderValues =
    new Dictionary<PushMessageUrgency, string>
    {
        { PushMessageUrgency.VeryLow, "very-low" },
        { PushMessageUrgency.Low, "low" },
        { PushMessageUrgency.High, "high" }
    };

    ...

    private static HttpRequestMessage SetUrgency(HttpRequestMessage pushMessageDeliveryRequest,
        PushMessage message)
    {
        switch (message.Urgency)
        {
            case PushMessageUrgency.Normal:
                break;
            case PushMessageUrgency.VeryLow:
            case PushMessageUrgency.Low:
            case PushMessageUrgency.High:
                pushMessageDeliveryRequest.Headers.Add(URGENCY_HEADER_NAME,
                    _urgencyHeaderValues[message.Urgency]);
                break;
            default:
                throw new NotSupportedException(
                    $"Not supported value has been provided for {nameof(PushMessageUrgency)}."
                );
        }

        return pushMessageDeliveryRequest;
    }

    ...
}

A push message whose delivery has been requested without the Urgency header is considered to have an urgency level of normal.

This is probably the last (at least for now) of my posts about push notifications. I've updated the demo project with support for the features described here (and there are still a couple of things on the issues list to come in the future).

This is my second post about Push Notifications. In the previous one I focused on the general flow and the Push API. This time I'm going to focus on requesting push message delivery.

In simple words, requesting push message delivery is performed by sending a POST request to the subscription endpoint. Of course the devil is in the details, which in this case are spread across four different RFCs.

Preparing push message delivery request

We already know that we should perform a POST request and we know the URL. If you have read the previous post you also know that we will need to use VAPID for authentication and encrypt the message payload. But that's not all. The Web Push Protocol specifies one required attribute: Time-To-Live. The purpose of this attribute is to inform the push service for how long it should retain the message (zero is an acceptable value and means that the push service is allowed to remove the message immediately after delivery). Taking this attribute into account, the push message can be represented by the following class.

public class PushMessage
{
    private int? _timeToLive;

    public string Content { get; set; }

    public int? TimeToLive
    {
        get { return _timeToLive; }

        set
        {
            if (value.HasValue && (value.Value < 0))
            {
                throw new ArgumentOutOfRangeException(nameof(TimeToLive),
                    "The TTL must be a non-negative integer");
            }

            _timeToLive = value;
        }
    }

    public PushMessage(string content)
    {
        Content = content;
    }
}

The Time-To-Live attribute should be delivered via the TTL header, which brings us to the following initial code for preparing the request.

public class PushServiceClient
{
    private const string TTL_HEADER_NAME = "TTL";
    private const int DEFAULT_TIME_TO_LIVE = 2419200;

    ...

    private HttpRequestMessage PreparePushMessageDeliveryRequest(PushSubscription subscription,
        PushMessage message)
    {
        HttpRequestMessage pushMessageDeliveryRequest =
            new HttpRequestMessage(HttpMethod.Post, subscription.Endpoint)
        {
            Headers =
            {
                {
                    TTL_HEADER_NAME,
                    (message.TimeToLive ?? DEFAULT_TIME_TO_LIVE).ToString(CultureInfo.InvariantCulture)
                }
            }
        };

        return pushMessageDeliveryRequest;
    }
}

If we tried to send this request, it would result in a 400 or 403 (depending on the push service) telling us that we are not authorized to request push message delivery. It's time to take a look at how VAPID works.

Authentication

The VAPID specification uses JSON Web Tokens. In order to authenticate with the push service, the application is supposed to sign the token with the Application Server Private Key and include it in the request. The final form of the JWT included in the request should be as follows.

<Base64 encoded JWT header JSON>.<Base64 encoded JWT body JSON>.<Base64 encoded signature>

One of the easiest ways of representing the JWT header and body in C# is through Dictionary<TKey, TValue>. The header in the case of VAPID is constant.

private static readonly Dictionary<string, string> _jwtHeader = new Dictionary<string, string>
{
    { "typ", "JWT" },
    { "alg", "ES256" }
};

The JWT body should contain the following claims:

  • Audience (aud) - The origin of the push resource (this binds the token to a specific push service).
  • Expiry (exp) - The time after which the token expires. The maximum is 24 hours, but typically half of that is used. The value should be the expiration moment expressed as Unix epoch time.

Additionally, the application may include a Subject (sub) claim which should contain contact information for the application server (as a mailto: or https: URI).

The signature should be a JSON Web Signature created using the ES256 algorithm (ECDSA over the P-256 curve with SHA-256).

Now to put this all into code.

public class VapidAuthentication
{
    private string _subject;
    private string _publicKey;
    private string _privateKey;

    private static readonly DateTime _unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0);
    private static readonly Dictionary<string, string> _jwtHeader = ...;

    ...

    private string GetToken(string audience)
    {
        // Audience validation removed for brevity
        ...

        Dictionary<string, object> jwtBody = GetJwtBody(audience);

        return GenerateJwtToken(_jwtHeader, jwtBody);
    }

    private Dictionary<string, object> GetJwtBody(string audience)
    {
        Dictionary<string, object> jwtBody = new Dictionary<string, object>
        {
            { "aud", audience },
            { "exp", GetAbsoluteExpiration() }
        };

        if (_subject != null)
        {
            jwtBody.Add("sub", _subject);
        }

        return jwtBody;
    }

    private static long GetAbsoluteExpiration()
    {
        TimeSpan unixEpochOffset = DateTime.UtcNow - _unixEpoch;

        return (long)unixEpochOffset.TotalSeconds + 43200;
    }

    private string GenerateJwtToken(Dictionary<string, string> jwtHeader, Dictionary<string, object> jwtBody)
    {
        string jwtInput = UrlBase64Converter.ToUrlBase64String(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(jwtHeader)))
            + "."
            + UrlBase64Converter.ToUrlBase64String(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(jwtBody)));

        // Signature generation removed for brevity
        ...

        return jwtInput + "." + UrlBase64Converter.ToUrlBase64String(jwtSignature);
    }
}

The above code doesn't contain the signature generation part as it wouldn't be readable; please take a look at it here. The implementation uses the ECDsaSigner class from the BouncyCastle project and some byte array padding routines. This cryptography can be a little expensive (taking into consideration the possible number of subscriptions), so it's important to remember that the JWT can be cached per Audience, with absolute expiration corresponding to the Expiry claim.
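
A minimal sketch of such a cache (a hypothetical helper built around the GetToken method, assuming the 12-hour Expiry claim produced by GetAbsoluteExpiration) could look like this.

public class VapidTokenCache
{
    private readonly ConcurrentDictionary<string, (string Token, DateTimeOffset ExpiresAt)> _tokens =
        new ConcurrentDictionary<string, (string Token, DateTimeOffset ExpiresAt)>();

    public string GetOrCreate(string audience, Func<string, string> tokenFactory)
    {
        if (_tokens.TryGetValue(audience, out (string Token, DateTimeOffset ExpiresAt) entry)
            && (entry.ExpiresAt > DateTimeOffset.UtcNow))
        {
            return entry.Token;
        }

        // Cache slightly shorter than the Expiry claim so an expired token is never reused.
        string token = tokenFactory(audience);
        _tokens[audience] = (token, DateTimeOffset.UtcNow.AddHours(11));

        return token;
    }
}

The tokenFactory parameter would simply delegate to GetToken for the given audience.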

Currently there are two ways of including the JWT in the request. One is the WebPush authentication scheme and the other is the vapid authentication scheme. The vapid authentication scheme is the one from the final specification, while WebPush comes from a draft version. The vapid scheme is very simple as it uses only the Authorization header.

Authorization: vapid t=<JWT>, k=<Base64 encoded Application Server Public Key>

So the value can be generated as easily as in the snippet below.

public class VapidAuthentication
{
    ...

    public string GetVapidSchemeAuthenticationHeaderValueParameter(string audience)
    {
        return String.Format("t={0}, k={1}", GetToken(audience), _publicKey);
    }

    ...
}

Unfortunately not all push services support the latest specification (at the moment of writing this I had no success using the vapid scheme with Chrome). The WebPush scheme seems to still be supported even by push services which already support vapid, so I'm going to use it here. The WebPush scheme is a little more complicated as it transfers the needed information by using two separate headers.

Authorization: WebPush <JWT>
Crypto-Key: p256ecdsa=<Base64 encoded Application Server Public Key>

This means that both values need to be exposed separately.

public class VapidAuthentication
{
    public readonly struct WebPushSchemeHeadersValues
    {
        public string AuthenticationHeaderValueParameter { get; }

        public string CryptoKeyHeaderValue { get; }

        public WebPushSchemeHeadersValues(string authenticationHeaderValueParameter,
            string cryptoKeyHeaderValue) : this()
        {
            AuthenticationHeaderValueParameter = authenticationHeaderValueParameter;
            CryptoKeyHeaderValue = cryptoKeyHeaderValue;
        }
    }

    ...

    public WebPushSchemeHeadersValues GetWebPushSchemeHeadersValues(string audience)
    {
        return new WebPushSchemeHeadersValues(GetToken(audience), "p256ecdsa=" + _publicKey);
    }

    ...
}

The authentication can now be plugged into the request preparation code.

public class PushServiceClient
{
    ...

    private const string WEBPUSH_AUTHENTICATION_SCHEME = "WebPush";
    private const string CRYPTO_KEY_HEADER_NAME = "Crypto-Key";

    ...

    private HttpRequestMessage PreparePushMessageDeliveryRequest(PushSubscription subscription,
        PushMessage message, VapidAuthentication authentication)
    {
        // Authentication validation removed for brevity
        ...

        HttpRequestMessage pushMessageDeliveryRequest = ...;
        pushMessageDeliveryRequest = SetAuthentication(pushMessageDeliveryRequest,
            subscription, authentication);

        return pushMessageDeliveryRequest;
    }

    private static HttpRequestMessage SetAuthentication(HttpRequestMessage pushMessageDeliveryRequest,
        PushSubscription subscription, VapidAuthentication authentication)
    {
        Uri endpointUri = new Uri(subscription.Endpoint);
        string audience = endpointUri.Scheme + @"://" + endpointUri.Host;

        VapidAuthentication.WebPushSchemeHeadersValues webPushSchemeHeadersValues =
            authentication.GetWebPushSchemeHeadersValues(audience);

        pushMessageDeliveryRequest.Headers.Authorization = new AuthenticationHeaderValue(
            WEBPUSH_AUTHENTICATION_SCHEME,webPushSchemeHeadersValues.AuthenticationHeaderValueParameter);

        pushMessageDeliveryRequest.Headers.Add(CRYPTO_KEY_HEADER_NAME,
            webPushSchemeHeadersValues.CryptoKeyHeaderValue);

        return pushMessageDeliveryRequest;
    }
}

This request could already be sent, as the payload is optional, but it would be nice to be able to include one.

Payload encryption

For privacy purposes the payload of a push message must be encrypted. The Web Push Encryption specification depends on Encrypted Content-Encoding for HTTP, which I've written about in the past. Thanks to that I already have a ready-to-use implementation; the tricky part is generating the input keying material.

When a subscription is being created on the client side, the client generates a new P-256 key pair and an authentication secret (a hard-to-guess random value). The public key from that key pair and the authentication secret are shared with the application server. Whenever the application wants to send a push message it should generate a new ECDH key pair on the P-256 curve. The public key from this pair should be used as the keying material identifier for aes128gcm, while the private key should be used together with the client public key to generate an ECDH agreement (called the shared secret). The client is capable of generating the same ECDH agreement based on its private key and the application public key. In order to increase security, the shared secret is combined with the authentication secret by calculating two HMAC SHA-256 hashes. The first is a hash of the shared secret keyed with the authentication secret, and its result is used as the key to hash the info parameter, which is defined as follows:

"WebPush: info" || 0x00 || Client Public Key || Application Public Key || 0x01

The result is truncated to 32 bytes and used as the keying material for aes128gcm. Using BouncyCastle allows for a quite clean implementation.

public class PushServiceClient
{
    ...

    private static readonly byte[] _keyingMaterialInfoParameterPrefix =
        Encoding.ASCII.GetBytes("WebPush: info");

    ...

    private static byte[] GetKeyingMaterial(PushSubscription subscription,
        AsymmetricKeyParameter applicationServerPrivateKey, byte[] applicationServerPublicKey)
    {
        IBasicAgreement ecdhAgreement = AgreementUtilities.GetBasicAgreement("ECDH");
        ecdhAgreement.Init(applicationServerPrivateKey);

        byte[] userAgentPublicKey = UrlBase64Converter.FromUrlBase64String(subscription.Keys["p256dh"]);
        byte[] authenticationSecret = UrlBase64Converter.FromUrlBase64String(subscription.Keys["auth"]);
        byte[] sharedSecret = ecdhAgreement.CalculateAgreement(
            ECKeyHelper.GetECPublicKeyParameters(userAgentPublicKey)).ToByteArrayUnsigned();
        byte[] sharedSecretHash = HmacSha256(authenticationSecret, sharedSecret);
        byte[] infoParameter = GetKeyingMaterialInfoParameter(userAgentPublicKey,
            applicationServerPublicKey);

        byte[] keyingMaterial = HmacSha256(sharedSecretHash, infoParameter);
        Array.Resize(ref keyingMaterial, 32);

        return keyingMaterial;
    }

    private static byte[] GetKeyingMaterialInfoParameter(byte[] userAgentPublicKey,
        byte[] applicationServerPublicKey)
    {
        // "WebPush: info" || 0x00 || ua_public || as_public || 0x01
        byte[] infoParameter = new byte[_keyingMaterialInfoParameterPrefix.Length
            + userAgentPublicKey.Length + applicationServerPublicKey.Length + 2];

        Array.Copy(_keyingMaterialInfoParameterPrefix, infoParameter,
            _keyingMaterialInfoParameterPrefix.Length);

        int infoParameterIndex = _keyingMaterialInfoParameterPrefix.Length + 1;

        Array.Copy(userAgentPublicKey, 0, infoParameter, infoParameterIndex,
            userAgentPublicKey.Length);

        infoParameterIndex += userAgentPublicKey.Length;

        Array.Copy(applicationServerPublicKey, 0, infoParameter, infoParameterIndex,
            applicationServerPublicKey.Length);

        infoParameter[infoParameter.Length - 1] = 1;

        return infoParameter;
    }

    private static byte[] HmacSha256(byte[] key, byte[] value)
    {
        byte[] hash = null;

        using (HMACSHA256 hasher = new HMACSHA256(key))
        {
            hash = hasher.ComputeHash(value);
        }

        return hash;
    }
}

This enables adding content to the push message.

public class PushServiceClient
{
    ...

    private HttpRequestMessage PreparePushMessageDeliveryRequest(PushSubscription subscription,
        PushMessage message, VapidAuthentication authentication)
    {
        ...

        HttpRequestMessage pushMessageDeliveryRequest = ...;
        pushMessageDeliveryRequest = SetAuthentication(pushMessageDeliveryRequest,
            subscription, authentication);
        pushMessageDeliveryRequest = SetContent(pushMessageDeliveryRequest, subscription, message);

        return pushMessageDeliveryRequest;
    }

    ...

    private static HttpRequestMessage SetContent(HttpRequestMessage pushMessageDeliveryRequest,
        PushSubscription subscription, PushMessage message)
    {
        if (String.IsNullOrEmpty(message.Content))
        {
            pushMessageDeliveryRequest.Content = null;
        }
        else
        {
            AsymmetricCipherKeyPair applicationServerKeys = ECKeyHelper.GenerateAsymmetricCipherKeyPair();
            byte[] applicationServerPublicKey =
                ((ECPublicKeyParameters)applicationServerKeys.Public).Q.GetEncoded(false);

            pushMessageDeliveryRequest.Content = new Aes128GcmEncodedContent(
                new StringContent(message.Content, Encoding.UTF8),
                GetKeyingMaterial(subscription, applicationServerKeys.Private, applicationServerPublicKey),
                applicationServerPublicKey,
                4096
            );
        }

        return pushMessageDeliveryRequest;
    }

    ...
}

Done with all the encryption! The request can now be sent.

public class PushServiceClient
{
    ...

    private readonly HttpClient _httpClient = new HttpClient();

    ...

    public async Task RequestPushMessageDeliveryAsync(PushSubscription subscription, PushMessage message,
        VapidAuthentication authentication)
    {
        HttpRequestMessage pushMessageDeliveryRequest = PreparePushMessageDeliveryRequest(subscription,
            message, authentication);

        HttpResponseMessage pushMessageDeliveryRequestResponse =
            await _httpClient.SendAsync(pushMessageDeliveryRequest);

        // TODO: HandlePushMessageDeliveryRequestResponse(pushMessageDeliveryRequestResponse);
    }

    ...
}

The last thing that remains is handling the response from the push service.

Handling response

There is a variety of error response codes we can receive from the push service, as those aren't standardized. The only two which the specification mentions openly are 400 and 403, but even those two aren't used consistently by implementations. The only thing we can be sure about is the status code indicating success, which is 201 Created. In all other cases the best that can be done is probably throwing an exception.

public class PushServiceClient
{
    private static void HandlePushMessageDeliveryRequestResponse(
        HttpResponseMessage pushMessageDeliveryRequestResponse)
    {
        if (pushMessageDeliveryRequestResponse.StatusCode != HttpStatusCode.Created)
        {
            throw new PushServiceClientException(pushMessageDeliveryRequestResponse.ReasonPhrase,
                pushMessageDeliveryRequestResponse.StatusCode);
        }
    }
}

There is one more piece of information which can be retrieved from a successful response - the Location header contains the URI of the created message. This is it for requesting push message delivery.

I encourage you to play with the demo application. It contains everything described here and I'm planning new things to come soon (for example JWT caching).

Probably all of you have encountered Push Notifications. A lot of portals bombard us with requests to allow notifications as soon as we visit them. Despite this abuse, when used in a responsible way, Push Notifications can be very useful. The key advantage is that the web application doesn't have to check if the user is online or not; it can simply request delivery of a push message and the user will receive it as soon as possible. Of course this capability doesn't come for free and I will try to show where the cost is hiding.

In this post I'm going to show how Push Notifications can be used from an ASP.NET Core web application, although most of the information (and the client side code) is cross-platform. This post focuses on the Push API and the general flow; there will be a follow-up post which will take a deep dive into sending push messages from a .NET based backend.

If you would like to see how it works before reading (or to look at the final code while reading), the demo application is available here.

Prerequisites

It's important to understand that there is a third party in the web push protocol flow: the push service. The push service acts as an intermediary which ensures reliable and efficient delivery of push messages to the client.

Web Push Protocol Flow

The presence of the push service raises security and privacy concerns. One of those concerns is authentication. Each subscription to a push service has its own unique URL which is a capability URL. This means that if such a URL leaked, other parties would be able to send a push message to the related subscription. This is why an additional mechanism has been introduced to limit the potential senders. This mechanism is Voluntary Application Server Identification (VAPID), the details of which I'm going to describe in the second post. What is important now is that VAPID requires Application Server Keys (a public and private key pair). The easiest way to generate those keys is to grab one of the web-push-libs (those are sample Push Notifications libraries which don't always implement the latest standards but are a good starting material). All of them have some kind of VAPID helper which exposes a method for generating keys. The public key has to be delivered to the client. In this post I will put it directly into snippets, but in real life I would suggest delivering it on demand (the demo application is doing exactly that), preferably over HTTPS.
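
For example, assuming the VapidHelper from web-push-libs/web-push-csharp (the exact API may differ between libraries), the keys could be generated once with a snippet along these lines (e.g. in a throwaway console tool) and then kept in configuration.

// One-off Application Server Keys generation; the values are then stored in configuration.
VapidDetails vapidKeys = VapidHelper.GenerateVapidKeys();

Console.WriteLine($"Public Key: {vapidKeys.PublicKey}");
Console.WriteLine($"Private Key: {vapidKeys.PrivateKey}");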

Service Worker

The client side components of the Push API specification rely on the Service Worker specification. More precisely, they extend the ServiceWorkerRegistration interface with a pushManager attribute, which exposes the PushManager interface. Service workers are beyond the scope of this post, so I will just quickly show how to register one.

let pushServiceWorkerRegistration;

function registerPushServiceWorker() {
    navigator.serviceWorker.register('/scripts/service-workers/push-service-worker.js',
        { scope: '/scripts/service-workers/push-service-worker/' })
        .then(function (serviceWorkerRegistration) {
            pushServiceWorkerRegistration = serviceWorkerRegistration;

            ...

            console.log('Push Service Worker has been registered successfully');
        }).catch(function (error) {
            console.log('Push Service Worker registration has failed: ' + error);
        });
};

The first parameter of the register method is the path to the script which will be registered as the service worker. The register method returns a promise; when it resolves successfully, the created ServiceWorkerRegistration should be stored for later use.

Subscribing

Before showing how to subscribe let me make one remark about when to subscribe. Please avoid attempting to subscribe on load; instead, give your users a nicely visible button or something else which they can use to subscribe when they wish to.

In order to subscribe for push messages, the subscribe method of the PushManager interface should be called. The push messages receiver will be the service worker to which the PushManager interface belongs. Two things should be passed to the subscribe method. One is the previously mentioned application server public key. The second is the userVisibleOnly flag with a true value. The userVisibleOnly flag indicates that a notification will be shown every time a push message arrives. If the subscription is created (the user has provided permission for notifications and the push service has responded correctly), it should be distributed to the application server as depicted in the diagram above.

function subscribeForPushNotifications() {
    let applicationServerPublicKey = urlB64ToUint8Array('<Public Key in Base64 Format>');

    pushServiceWorkerRegistration.pushManager.subscribe({
        userVisibleOnly: true,
        applicationServerKey: applicationServerPublicKey
    }).then(function (pushSubscription) {
        fetch('push-notifications-api/subscriptions', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(pushSubscription)
        }).then(function (response) {
            if (response.ok) {
                console.log('Successfully subscribed for Push Notifications');
            } else {
                console.log('Failed to store the Push Notifications subscription on server');
            }
        }).catch(function (error) {
            console.log('Failed to store the Push Notifications subscription on server: ' + error);
        });

        ...
    }).catch(function (error) {
        if (Notification.permission === 'denied') {
            ...
        } else {
            console.log('Failed to subscribe for Push Notifications: ' + error);
        }
    });
};

The request for distributing the subscription is a standard AJAX request. This gives a chance to provide any additional information to the application server (cookies identifying the user, additional attributes in the payload etc.). When it comes to the subscription itself, there are two key attributes which must be stored. The first is the endpoint attribute (it contains the previously mentioned capability URL) and the second is keys. On the server side it can be represented with a simple class.

public class PushSubscription
{
    public string Endpoint { get; set; }

    public IDictionary<string, string> Keys { get; set; }
}

The Keys property is a dictionary which is used to share any required push message encryption keys. This is how the privacy of push messages is achieved. It's the client (browser) who generates those keys; the push service doesn't know about them. Currently there are two keys defined: p256dh, which is the P-256 ECDH public key, and auth, which is the authentication secret. Details of push message encryption (as well as VAPID) will be described in the next post.

Before implementing an action for handling the subscription distribution request, there is a service needed which will take care of storing the subscriptions. At this point this service can have a very simple interface.

public interface IPushSubscriptionStore
{
    Task StoreSubscriptionAsync(PushSubscription subscription);
}

This service can have many different implementations. The one in the demo project is using SQLite, but NoSQL databases sound like good candidates for storing this kind of data. With the service in place, the action implementation is quite simple.

namespace Demo.AspNetCore.PushNotifications.Controllers
{
    [Route("push-notifications-api")]
    public class PushNotificationsApiController : Controller
    {
        private readonly IPushSubscriptionStore _subscriptionStore;

        public PushNotificationsApiController(IPushSubscriptionStore subscriptionStore)
        {
            _subscriptionStore = subscriptionStore;
        }

        // POST push-notifications-api/subscriptions
        [HttpPost("subscriptions")]
        public async Task<IActionResult> StoreSubscription([FromBody]PushSubscription subscription)
        {
            await _subscriptionStore.StoreSubscriptionAsync(subscription);

            return NoContent();
        }
    }
}

At this point the cost of push messages becomes visible. The first part is storage (all active subscriptions must be stored, and queried as frequently as messages are being sent) and the second part is the computation needed to request the delivery of push messages.

Requesting delivery

We already know all the building blocks needed to request push message delivery. Every subscription contains unique information which is needed for creating the request, so they all need to be iterated. I decided to make the IPushSubscriptionStore responsible for the iteration, which should make a memory-efficient implementation easier.

public interface IPushSubscriptionStore
{
    ...

    Task ForEachSubscriptionAsync(Action<PushSubscription> action);
}

There should also be an abstraction for requesting the delivery.

public class PushNotificationServiceOptions
{
    public string Subject { get; set; }

    public string PublicKey { get; set; }

    public string PrivateKey { get; set; }
}

public interface IPushNotificationService
{
    void SendNotification(PushSubscription subscription, string payload);
}

With such an API, sending a push message can be represented as a single call.

await _subscriptionStore.ForEachSubscriptionAsync(
    (PushSubscription subscription) => _notificationService.SendNotification(subscription, "<Push Message>")
);

All the complexity is hiding inside the IPushNotificationService implementation. This is also where the computation cost of push messages lies. The application must generate the values for the VAPID headers based on the options provided and encrypt the message payload based on the keys provided in the subscription. The VAPID headers generation can be done once, but the message payload has to be encrypted separately for every subscription. That's a lot of cryptography to do.

The push service client implementation is the exact subject of the next post, but this post's goal is to have a fully working flow, so I'm going to use WebPushClient from web-push-libs/web-push-csharp (it's based on draft versions of VAPID and push message encryption, but currently those are still supported) without going into details.

internal class WebPushPushNotificationService : IPushNotificationService
{
    private readonly PushNotificationServiceOptions _options;
    private readonly WebPushClient _pushClient;

    public string PublicKey { get { return _options.PublicKey; } }

    public WebPushPushNotificationService(IOptions<PushNotificationServiceOptions> optionsAccessor)
    {
        _options = optionsAccessor.Value;

        _pushClient = new WebPushClient();
        _pushClient.SetVapidDetails(_options.Subject, _options.PublicKey, _options.PrivateKey);
    }

    public void SendNotification(Abstractions.PushSubscription subscription, string payload)
    {
        var webPushSubscription = new WebPush.PushSubscription(
            subscription.Endpoint,
            subscription.Keys["p256dh"],
            subscription.Keys["auth"]);

        _pushClient.SendNotification(webPushSubscription, payload);
    }
}
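
The options and services can then be wired up in Startup. A sketch of the registration (assuming a Configuration property on Startup; the configuration section name and the SQLite store class name are illustrative, the demo project has its own equivalents):

public void ConfigureServices(IServiceCollection services)
{
    // Subject, PublicKey and PrivateKey come from configuration.
    services.Configure<PushNotificationServiceOptions>(
        Configuration.GetSection("PushNotifications"));

    services.AddSingleton<IPushSubscriptionStore, SqlitePushSubscriptionStore>();
    services.AddTransient<IPushNotificationService, WebPushPushNotificationService>();

    services.AddMvc();
}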

Receiving

The push message will be delivered directly to the service worker which has been used for the registration and will trigger a push event. The payload can be extracted from the event argument and used to display a notification.

self.addEventListener('push', function (event) {
    event.waitUntil(self.registration.showNotification('Demo.AspNetCore.PushNotifications', {
        body: event.data.text(),
        icon: '/images/push-notification-icon.png'
    }));
});

The showNotification method has a number of options which impact how the notification will look; you can read about them here.

Unsubscribing

There is one last thing remaining. To be a good web citizen, the application should provide a way for the user to unsubscribe from notifications. The process is similar to subscribing. First we should unsubscribe from the push service and then discard the subscription on the server side.

function unsubscribeFromPushNotifications() {
    pushServiceWorkerRegistration.pushManager.getSubscription().then(function (pushSubscription) {
        if (pushSubscription) {
            pushSubscription.unsubscribe().then(function () {
                fetch('push-notifications-api/subscriptions?endpoint='
                    + encodeURIComponent(pushSubscription.endpoint),
                    { method: 'DELETE' }
                ).then(function (response) {
                    if (response.ok) {
                        console.log('Successfully unsubscribed from Push Notifications');
                    } else {
                        console.log('Failed to discard the Push Notifications subscription from server');
                    }
                }).catch(function (error) {
                   console.log('Failed to discard the Push Notifications subscription from server: ' + error);
                });

                ...
            }).catch(function (error) {
                console.log('Failed to unsubscribe from Push Notifications: ' + error);
            });
        }
    });
};

To support discarding of a subscription, the IPushSubscriptionStore needs to be extended. The endpoint is unique for every subscription, so it can be used as the primary key.

public interface IPushSubscriptionStore
{
    ...

    Task DiscardSubscriptionAsync(string endpoint);
}
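
With that the store interface is complete. For testing purposes (or just to see the flow working without a database) a minimal in-memory implementation could look like the sketch below; the demo project uses SQLite instead and the class name here is illustrative.

public class InMemoryPushSubscriptionStore : IPushSubscriptionStore
{
    // The endpoint is unique per subscription, so it serves as the key.
    private readonly ConcurrentDictionary<string, PushSubscription> _subscriptions =
        new ConcurrentDictionary<string, PushSubscription>();

    public Task StoreSubscriptionAsync(PushSubscription subscription)
    {
        _subscriptions.TryAdd(subscription.Endpoint, subscription);

        return Task.CompletedTask;
    }

    public Task ForEachSubscriptionAsync(Action<PushSubscription> action)
    {
        foreach (PushSubscription subscription in _subscriptions.Values)
        {
            action(subscription);
        }

        return Task.CompletedTask;
    }

    public Task DiscardSubscriptionAsync(string endpoint)
    {
        _subscriptions.TryRemove(endpoint, out _);

        return Task.CompletedTask;
    }
}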

All that remains is an action which will handle the DELETE request.

namespace Demo.AspNetCore.PushNotifications.Controllers
{
    public class PushNotificationsApiController : Controller
    {
        ...

        // DELETE push-notifications-api/subscriptions?endpoint={endpoint}
        [HttpDelete("subscriptions")]
        public async Task<IActionResult> DiscardSubscription(string endpoint)
        {
            await _subscriptionStore.DiscardSubscriptionAsync(endpoint);

            return NoContent();
        }
    }
}

This is enough to create a nicely behaving web application which uses Push Notifications. As already mentioned the demo application can be found here.

I wasn't expecting that I would be writing a post about POST Tunneling in 2017 (almost 2018); I thought it was a thing of the past.

Recently a friend of mine reached out to me for advice. His company had delivered a new ASP.NET Core based service to a client. The service was exposing a Web API which (among others) relied on PATCH requests. After the deployment it turned out that one of the older applications which was supposed to integrate with the new service wasn't able to issue PATCH requests due to technical limitations. I suggested they check if that old application could issue custom HTTP headers, which would allow them to solve the problem with POST Tunneling.

What is POST Tunneling

POST Tunneling is a quite old technique. I encountered it for the first time in 2012. Back then the issue was very common. A lot of HTTP clients (including XMLHttpRequest in some browsers) didn't provide support for all HTTP methods. Also, many corporate network infrastructures were blocking certain methods. The solution was to tunnel such a method through a POST request with the help of a custom header (I believe that X-HTTP-Method-Override was the most frequently used one). The server would examine the incoming POST request and, if the header was present, its value would be treated as the actual method.
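
For example, a PATCH tunneled through a POST could look like this on the wire (the path and payload are purely illustrative).

POST /api/orders/42 HTTP/1.1
X-HTTP-Method-Override: PATCH
Content-Type: application/json

<PATCH payload>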

Middleware implementation

The middleware should allow for configuring two things: the name of the custom header and the list of methods which can be tunneled.

public class PostTunnelingOptions
{
    public string HeaderName { get; set; }

    public IEnumerable<string> AllowedMethods { get; set; }
}

The implementation is very similar to the SSL Acceleration Middleware I've done in the past. The heart of it is IHttpRequestFeature with its Method property. Changing the value of that property will make all the later steps of the pipeline use the new value.

public class PostTunnelingMiddleware
{
    private readonly RequestDelegate _next;

    private readonly string _headerName;
    private readonly HashSet<string> _allowedMethods;

    public PostTunnelingMiddleware(RequestDelegate next, IOptions<PostTunnelingOptions> options)
    {
        // Null checks removed for brevity
        _next = next;

        _headerName = options.Value.HeaderName;

        _allowedMethods = new HashSet<string>();
        if (options.Value.AllowedMethods != null)
        {
            foreach (string allowedMethod in options.Value.AllowedMethods)
            {
                _allowedMethods.Add(allowedMethod.ToUpper());
            }
        }
    }

    public Task Invoke(HttpContext context)
    {
        if (HttpMethods.IsPost(context.Request.Method))
        {
            if (context.Request.Headers.ContainsKey(_headerName))
            {
                string tunelledMethod = context.Request.Headers[_headerName];
                if (_allowedMethods.Contains(tunelledMethod))
                {
                    IHttpRequestFeature httpRequestFeature = context.Features.Get<IHttpRequestFeature>();
                    httpRequestFeature.Method = tunelledMethod;
                }
            }
        }

        return _next(context);
    }
}

In order to add POST Tunneling to the application it's enough to register the middleware at the desired position in the pipeline.

public class Startup
{
    ...

    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        ...

        app.UseMiddleware<PostTunnelingMiddleware>(Options.Create(new PostTunnelingOptions
        {
            HeaderName = "X-HTTP-Method-Override",
            AllowedMethods = new[] { HttpMethods.Patch }
        }));

        app.UseMvc();

        ...
    }
}

I've made the middleware (with some helper extensions) available as a Gist, so if any of you ever end up with a similar problem, it's out there ready to use.

If you frequently log your requests you might have noticed the presence of the Save-Data header (especially if you have a significant amount of traffic from mobile devices). This is not a common header. I noticed it for the first time when I was playing with Opera in Opera Turbo mode and I was intrigued by it. It turns out that beside Opera Turbo it's being sent by both Chrome and Opera when the Data Saver/Data savings option on the Android versions of those browsers is enabled. The intent of this header is to hint to the server that the client would like to reduce data usage. This immediately gave me a couple of interesting ideas.

First things first - reading the header from the request

Before I could do anything useful with the header I had to get it from the request. The header definition says that its value can consist of multiple tokens, while only one (on) is currently defined. I've decided to represent this with the following class.

public class SaveDataHeaderValue
{
    private bool? _on = null;

    public bool On
    {
        get
        {
            if (!_on.HasValue)
            {
                _on = Tokens.Contains("on", StringComparer.InvariantCultureIgnoreCase);
            }

            return _on.Value;
        }
    }

    public IReadOnlyCollection<string> Tokens { get; }

    public SaveDataHeaderValue(IReadOnlyCollection<string> tokens)
    {
        Tokens = tokens ?? throw new ArgumentNullException(nameof(tokens));
    }
}

Now I could create a simple extension method which would grab the raw header value from the request, split it, remove any optional white space and instantiate the SaveDataHeaderValue.

public static class HttpRequestHeadersExtensions
{
    public static SaveDataHeaderValue GetSaveData(this HttpRequest request)
    {
        if (!request.HttpContext.Items.ContainsKey("SaveDataHeaderValue"))
        {
            StringValues headerValue = request.Headers["Save-Data"];
            if (!StringValues.IsNullOrEmpty(headerValue) && (headerValue.Count == 1))
            {
                string[] tokens = ((string)headerValue).Split(';');
                for (int i = 0; i < tokens.Length; i++)
                {
                    tokens[i] = tokens[i].Trim();
                }

                request.HttpContext.Items["SaveDataHeaderValue"] = new SaveDataHeaderValue(tokens);
            }
        }

        return request.HttpContext.Items["SaveDataHeaderValue"] as SaveDataHeaderValue;
    }
}

I'm also caching the SaveDataHeaderValue instance in HttpContext.Items so parsing happens only once per request.

Dedicated image URLs

My first idea was to be able to define different image sources depending on the presence of the hint. I wanted something similar to what the link and script Tag Helpers provide in the form of asp-fallback-href/asp-fallback-src - an attribute which would contain the alternative source. The framework provides a UrlResolutionTagHelper class which can be used as a base in order to take care of the URL processing. What was left for me was to check if the hint had been sent along with the request and, if yes, replace the original value of the src attribute with the value from the new attribute (which I've named asp-savedata-src). I've also targeted my Tag Helper only at img elements that have both attributes.

[HtmlTargetElement("img", Attributes = "src,asp-savedata-src",
    TagStructure = TagStructure.WithoutEndTag)]
public class ImageTagHelper : UrlResolutionTagHelper
{
    [HtmlAttributeName("asp-savedata-src")]
    public string SaveDataSrc { get; set; }

    public ImageTagHelper(IUrlHelperFactory urlHelperFactory, HtmlEncoder htmlEncoder)
        : base(urlHelperFactory, htmlEncoder)
    { }

    public override void Process(TagHelperContext context, TagHelperOutput output)
    {
        // Validations skipped for brevity
        ...

        output.CopyHtmlAttribute("src", context);
        if (ViewContext.HttpContext.Request.GetSaveData()?.On ?? false)
        {
            output.Attributes.SetAttribute("src", SaveDataSrc);
        }
        ProcessUrlAttribute("src", output);

        output.Attributes.RemoveAll("asp-savedata-src");
    }
}

This Tag Helper can be used like this.

<img src="~/images/highres.png" asp-savedata-src="~/images/lowres.png" />

Which is exactly what I wanted and I believe it looks very elegant. The approach can easily be extended to other media (for example video).

Conditional markup

The second idea was conditional markup generation. There are often areas of a page which don't provide important information and serve more decorative purposes. Those areas could be skipped if the client has opted for reduced data usage. For this purpose a simple HtmlHelper extension should be enough.

public static class HtmlHelperSaveDataExtensions
{
    public static bool ShouldSaveData(this IHtmlHelper htmlHelper)
    {
        if (htmlHelper == null)
        {
            throw new ArgumentNullException(nameof(htmlHelper));
        }

        return htmlHelper.ViewContext.HttpContext.Request.GetSaveData()?.On ?? false;
    }
}

With this extension, such non-crucial areas of the page can be wrapped in an if block.

@if (!Html.ShouldSaveData())
{
    ...
}

This allows for a more fine-tuned markup delivery strategy, but the idea can be taken further.

Dedicated actions

Having conditional sections is great, but having dedicated views might be better in some cases. The Save-Data header can easily become a part of the action selection process. All that is needed is an attribute which implements the IActionConstraint interface, which boils down to implementing the Accept method. The Accept method should return true if the action is valid for the request.

[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = true)]
public class SaveDataAttribute : Attribute, IActionConstraint
{
    private bool _on;

    public int Order { get; set; }

    public SaveDataAttribute(bool on)
    {
        _on = on;
    }

    public bool Accept(ActionConstraintContext context)
    {
        return (context.RouteContext.HttpContext.Request.GetSaveData()?.On ?? false) == _on;
    }
}

Applying the attribute to actions having the same action name allows for a clean separation between the regular and reduced-data flows.

public class DemoController : Controller
{
    [SaveData(false)]
    public IActionResult Index()
    {
        return View();
    }

    [ActionName(nameof(Index))]
    [SaveData(true)]
    public IActionResult IndexSavedData()
    {
        return View(nameof(IndexSavedData));
    }
}

This shows the power hiding behind this header. It opens a number of ways to optimize the application for clients which desire it, and the samples above are just the simplest usages I could come up with. There are probably a lot more interesting usages that I haven't thought of.

A couple more words about the broader context

The Save-Data header is part of the Client Hints proposal, which aims at addressing the need to deliver optimized content for each device. The proposal contains more headers which provide information mostly about the display capabilities of the client. It also defines a mechanism for advertising supported hints through the Accept-CH and Accept-CH-Lifetime headers. As I was going through the specification I created a simple middleware capable of setting those headers. I'm not aware of any browser supporting those headers, so this is more of a learning example, although it has one real-life use. In addition to advertising client hints support it also interacts with the Vary header. This is important if the response which can be optimized is also cacheable. In such a case the cache needs to know that the hint headers need to be taken into consideration when choosing a response. The middleware will add all the hint headers which have been configured as supported to the Vary header.
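
A minimal sketch of such a middleware (illustrative only, not the actual project code) could look like this.

public class ClientHintsAdvertisingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly string _supportedHintsHeaderValue;

    public ClientHintsAdvertisingMiddleware(RequestDelegate next, params string[] supportedHints)
    {
        _next = next;
        _supportedHintsHeaderValue = String.Join(", ", supportedHints);
    }

    public Task Invoke(HttpContext context)
    {
        context.Response.OnStarting(() =>
        {
            // Advertise the supported hints to the client ...
            context.Response.Headers.Append("Accept-CH", _supportedHintsHeaderValue);
            // ... and let caches know that those headers influence the chosen response.
            context.Response.Headers.Append("Vary", _supportedHintsHeaderValue);

            return Task.CompletedTask;
        });

        return _next(context);
    }
}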

I've put the projects containing the middleware and helpers built around the Save-Data header up on GitHub.
