Florian Rappl

Migration from Classic Hosting to Serverless

Cover photo by Chris Briggs on Unsplash

This is finally the year where I'll move away from my dedicated server, which has now been running for 15 years. It was a fun ride, but maintaining that server costs too much time and money. Before we go into the technical details, let's see what I expect from this migration:

  • A cleaner code base (finally I can clean up my stuff, remove some parts, and modernize others)
  • No more FTP or messy, opaque deployments - everything should be handled by CI/CD pipelines
  • Cost reduction; it sounds weird, but the last thing I want is a cost increase (today hosting is about 30 € per month and my goal is to bring this below or close to 10 € - note: I pay much more for domains, but those costs are not included here as they will remain the same).

This article is probably going to be the start of a series - and I want to start with something not too simple, but also not too difficult.

Example: Mostly Static and a Bit Dynamic

Let's consider the following website that I created: html5skript.florian-rappl.de. At its heart it's mostly a standard static website. Its purpose was to provide my students with lecture material for a course on Web Applications using HTML5 (with CSS3 and JavaScript). Hence it was quite natural to present the online material in a form that already demonstrates everything they'd learn.

The following image shows the website in action:

The website in action

There is one special consideration for this one: to showcase API calls and handling of data, the website is searchable via an API that was created using ASP.NET (WebForms). How to proceed here? Well, I'd go with an Azure Static Web App. This is mostly static storage (ideal for the use case), but it can also be enriched using Azure Functions. As the search is rather small, we can decide whether to port over the "standard" C#/.NET code or rewrite the search using JavaScript/Node.js.

Currently, as mentioned, the website is hosted on a dedicated webserver. The webserver is mostly operated from a tool called Plesk, which abstracts away the underlying OS (in this case Windows Server). Importantly, the website is a subdomain of an existing domain (which handles the DNS settings) and has a dedicated database ("html5").

The subdomain area of the website in the Plesk administration tool

While the settings overall are not that bad, the database needs to be preserved. Right now, this database is not used in the website, but it still contains relevant information that has to be archived properly. As this is a MySQL database, we can access it on the web using the phpMyAdmin tool:

phpMyAdmin

We use the tool to make an export in the "SQL" format. This way, we can preserve the contents locally, as well as import them into another MySQL database later. We'll come back to this in another article. For now the important part is that the database needs to be migrated, too - just not in this part of the overall migration.
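
If you prefer the command line over phpMyAdmin and have shell access to the database host, the same export can also be produced with mysqldump (the user name below is a placeholder):

mysqldump --user=admin --password --databases html5 > html5-export.sql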

Next up, we'll create a new Azure Static Web App in the free tier. We choose "custom" as the deployment option. That way we are free to host and deploy the code however we want.

Azure Static Web App
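
For reference, the same resource can also be created via the Azure CLI instead of the portal - a minimal sketch, with placeholder names for the app and resource group:

az staticwebapp create \
  --name html5skript \
  --resource-group my-websites \
  --sku Free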

For my personal homepage I aggregate the code in a free Azure DevOps project. I already created the project in the past, so now I only need to open it and add another repository.

Azure DevOps project

Azure DevOps new repository

Once created, I can clone the new repository locally. Afterwards, I added all the files needed for the static web app part. One recommendation here is to introduce some structure. For this website I went with the following:

  • README.md to know what the repository does and how to work with it
  • public folder for the static files that do not require any build step
  • azure.pipelines.yml for setting up the CI/CD pipeline

All the files should now be copied to public. At this point we need to set up the pipeline:

trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
- task: AzureStaticWebApp@0
  inputs:
    app_location: '/public'
    skip_app_build: true
    skip_api_build: true
    azure_static_web_apps_api_token: '...'

Copying the deployment token from the Azure Static Web App directly into the pipeline declaration (azure_static_web_apps_api_token) is not secure. A better way is to use a variable group for this:

Creating a variable group with a secret

To access this secret you can modify the pipeline to look as follows:

trigger:
- master

variables:
- group: deployment-tokens

pool:
  vmImage: ubuntu-latest

steps:
- task: AzureStaticWebApp@0
  inputs:
    app_location: '/public'
    skip_app_build: true
    skip_api_build: true
    azure_static_web_apps_api_token: '$(html5skript-token)'

Now it's time to see if the static part already works as it should. Going to the URL provided by Azure leads to the website as expected.

Web app working

However, the dynamic part is - at this point in time - non-operational. Let's fix that. As mentioned, the original code was an ASP.NET project that actually read the HTML files and created an index based on their content. Besides implementing the search functionality, the API could also deal with inputs such as cos(2), essentially evaluating math expressions. The latter was more of a joke.

The API returned its content either as a JSON string or as JSON-P. The latter is actually a script that calls a function named by the callback query parameter. In the "old" days that was a way of circumventing any CORS issues... Luckily, today we can do better.

To keep things simple we'll create a .NET 7-based Azure Function that is a port of the existing code. Since the filesystem of the old web server is no longer available, we'll use a static snapshot of the files taken at deployment (as both the static files and the API are deployed in sync, this is fine). The whole thing will be placed in the api directory (next to the public folder that we've already created).

We'll start with the csproj file, which determines the dependencies, the snapshot of the files, and the Azure Functions version:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net7.0</TargetFramework>
    <AzureFunctionsVersion>v4</AzureFunctionsVersion>
    <OutputType>Exe</OutputType>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.0.13" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.16.2" OutputItemType="Analyzer" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.20.0" />
  </ItemGroup>
  <ItemGroup>
    <None Include="$(ProjectDir)..\public\lectures\**" CopyToOutputDirectory="PreserveNewest" />
    <None Update="host.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>

The associated local.settings.json also confirms the operational mode for the Azure Function - in preparation for the longest-supported model we are going with the isolated worker model:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}

Then I moved over the code - keeping most files as-is, with the exception of search.aspx.cs (the code-behind). I transformed it into an HTTP-triggered function:

public class ApiTrigger
{    
    [Function("search")]
    public HttpResponseData Run([HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req)
    {
        // ...
    }
}
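
One thing the snippets above do not show: the isolated worker model needs its own entry point. A minimal Program.cs for this setup would look roughly like the following (a sketch based on the standard isolated worker template):

using Microsoft.Extensions.Hosting;

// Bootstraps the isolated worker that hosts our HTTP-triggered function
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();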

Instead of the older Server.MapPath(...) I just use Path.Join(Environment.CurrentDirectory, ...), assuming that the relevant files (the snapshot from the lectures folder of the static assets) will be found at this location. The rest just works as-is, with the exception of the response generation. This was previously done using the static Response class - now I can just return the response data:

var response = req.CreateResponse(HttpStatusCode.OK);

if (cb != null)
{
    //Write JSON-P Data
    response.Headers.Add("Content-Type", "text/javascript");
    response.WriteString(value.Stringify(cb));
}
else
{
    //Write JSON Data
    response.Headers.Add("Content-Type", "application/json");
    response.WriteString(value.Stringify());
}

return response;
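
For illustration, the parameter handling and file lookup inside the function could look roughly like this - a sketch only, with a hypothetical search parameter name and assuming the Stringify helpers are ported as-is:

// Read the query parameters from the isolated worker's request
var query = System.Web.HttpUtility.ParseQueryString(req.Url.Query);
var term = query["q"];        // hypothetical name of the search parameter
var cb = query["callback"];   // JSON-P callback, e.g. ?callback=handleResults
                              // (the body then becomes handleResults({...}))

// Replacement for the old Server.MapPath(...): resolve the deployed snapshot
var root = Path.Join(Environment.CurrentDirectory, "lectures");
var files = Directory.GetFiles(root, "*.html", SearchOption.AllDirectories);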

Much cleaner already (and, together with more recent C# features such as spans and ranges, also better performing).

To make the API build, we need to remove the skip_api_build property and add the api_location property pointing to the api folder:

trigger:
- master

pool:
  vmImage: ubuntu-latest

variables:
- group: deployment-tokens

steps:
- task: AzureStaticWebApp@0
  inputs:
    app_location: '/public'
    api_location: '/api'
    skip_app_build: true
    azure_static_web_apps_api_token: '$(html5skript-token)'


Finally, I replaced all occurrences of http://html5skript.florian-rappl.de/search.aspx with https://html5skript.florian-rappl.de/api/search. The previous use of http: was a mistake on multiple levels. While I could have kept the URLs protocol-agnostic, I wanted to force HTTPS for now. Additionally, the path has changed from search.aspx to api/search.

Now we can commit and build again. Once everything is up to date, we do the migration by changing the CNAME in Plesk:

Changing the DNS settings
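
Conceptually, the record simply points the subdomain at the auto-generated hostname of the Static Web App (the target below is a placeholder):

html5skript.florian-rappl.de.  CNAME  <generated-name>.azurestaticapps.net.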

Don't forget to also add the custom domain in the Azure Static Web App. This way, Azure will create a free certificate for HTTPS and can map the incoming domain to our website.

Conclusion

It runs - faster and more cost-efficiently (for the given subdomain no additional costs will occur). The crucial part was to identify a cloud provider or deployment model that aligns well with the application and its anticipated usage. In this case the application is mostly static with a small dynamic part running on .NET. Choosing Azure and a free service there makes sense, as I doubt that this website / subdomain is used by many people. After all, it was mostly targeted towards my students, and I have not given this (or any other) lecture in almost a decade. The lecture itself continues to be given, but with updated material and with no dependency on this website.

In the next post I'll look into another application that I had to port over: a multiplayer game involving a WebSocket server written from scratch (no SignalR or anything else - this dinosaur comes from a time when WebSockets had just landed in the browser and .NET did not have any out-of-the-box support yet).

Currently, the dedicated server is still operational - but I need to finish the migration by the end of the year.
