My modern development stack

One of the things I do outside of my day job is build web applications for small businesses. I wanted to share the dev stack I use for that work, and why I chose each piece.

Platform

I use .NET Core as my development platform. It’s a modern, cross-platform framework for building applications: it runs on Windows, Linux, and macOS, was designed from the start to run in Docker without a lot of effort, and has officially supported base Docker images.

Data Access

Data is a complicated piece of the puzzle in any application and can be split up into data storage and application level data access.

For applications that run on AWS or other Linux-native environments, I generally go with Postgres for the database; on Windows I generally lean towards Microsoft SQL Server (the Express edition for most projects, since its 10 GB limit is generally more than they’ll ever need).

Inside the application, most of the querying is done using Entity Framework Core, a new ORM from Microsoft that handles change tracking and transaction management automatically. When I need to, I combine it with Dapper, a micro-ORM from the folks over at Stack Exchange that maps the inputs and outputs of raw SQL queries to models.
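To give a rough idea of how the two coexist (AppDbContext, Invoice, and CustomerTotal are made-up names for illustration, not anything prescriptive):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Dapper;
using Microsoft.EntityFrameworkCore;

public class InvoiceQueries
{
    private readonly AppDbContext _dbContext;

    public InvoiceQueries(AppDbContext dbContext) => _dbContext = dbContext;

    // Everyday querying goes through EF Core, with change tracking for free.
    public Task<List<Invoice>> GetOverdueAsync() =>
        _dbContext.Invoices
            .Where(i => i.DueDate < DateTime.UtcNow && !i.Paid)
            .ToListAsync();

    // Hot paths drop to Dapper: hand-written SQL mapped straight to a model.
    public Task<IEnumerable<CustomerTotal>> GetTotalsAsync() =>
        _dbContext.Database.GetDbConnection().QueryAsync<CustomerTotal>(
            "SELECT [CustomerId], SUM([Amount]) AS [Total] FROM [Invoices] GROUP BY [CustomerId]");
}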

Server-Side Framework

If you guessed ASP.NET Core, you are a winner! Co-developed with .NET Core, it’s a powerful, high-performance web framework. Shipping with the framework are built-in identity management, a powerful MVC framework with native REST support, a powerful HTML template language for when you need it (Razor), and native development and production support for SPA frameworks such as Angular, React, and Vue.js.
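To give a quick taste of the REST side (WidgetsController is a made-up example):

using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
public class WidgetsController : Controller
{
    // GET /api/widgets/5
    [HttpGet("{id}")]
    public IActionResult Get(int id) =>
        Ok(new { Id = id, Name = $"Widget {id}" });
}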

Inversion of Control / Dependency Injection

While .NET Core (and by proxy, ASP.NET Core) has a built-in IoC container, it’s very simple. However, it’s designed to be replaced by a more powerful container, and it supplies abstractions for applications (and libraries) to consume so they aren’t tied directly to the built-in one.

For new applications, I generally go with Lamar, which was built from the ground up on these abstractions. For older applications, I generally use StructureMap, though I’ll probably move those to Lamar eventually: Lamar fixed the memory issues that plagued StructureMap, and StructureMap is officially deprecated in favor of Lamar.
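As a rough sketch of what that looks like (assuming the Lamar.Microsoft.DependencyInjection integration package, which supplies a UseLamar() extension for the host builder), the Startup class ends up something like:

using Lamar;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    // With Lamar plugged in via UseLamar() on the host builder,
    // ASP.NET Core hands you Lamar's ServiceRegistry here.
    public void ConfigureContainer(ServiceRegistry services)
    {
        services.AddMvc();

        // Lamar's assembly scanning, carried over from StructureMap.
        services.Scan(scanner =>
        {
            scanner.TheCallingAssembly();
            scanner.WithDefaultConventions();
        });
    }

    public void Configure(IApplicationBuilder app) => app.UseMvc();
}

The nice part is that anything registered through the standard IServiceCollection abstractions keeps working; Lamar just layers its richer features on top.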

Server-Side Validations

ASP.NET MVC actually has built-in support for model validation (i.e., checking that submitted data is actually valid), but it’s not nearly as powerful as I’d like. Instead, I use FluentValidation, which integrates with ASP.NET MVC and will run before the native validations by default. It’s flexible, integrates with the built-in IoC abstractions, and supports async custom validators.
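As a quick sketch (RegisterModel is a made-up model), a validator looks something like this:

using FluentValidation;

public class RegisterModel
{
    public string Email { get; set; }
    public string Password { get; set; }
}

public class RegisterModelValidator : AbstractValidator<RegisterModel>
{
    public RegisterModelValidator()
    {
        // Each rule reads as a sentence; chains compose left to right.
        RuleFor(x => x.Email).NotEmpty().EmailAddress();
        RuleFor(x => x.Password).NotEmpty().MinimumLength(8);
    }
}

With the ASP.NET MVC integration wired up, validators run during model binding and failures surface through ModelState like the native validations do.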

CQRS (Command Query Responsibility Segregation)

This is a pattern that I fell in love with a few years back. The general idea is that instead of a bunch of services that mix queries (reading data) with commands (updating data), you split the two apart, potentially using models best suited to each task.

For that, I use MediatR, which is a great little lightweight project.
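To sketch the shape of it (GetBlargByIdQuery, BlargDto, and the handler are made-up names, and this assumes a recent MediatR version), a query-side pair looks something like this:

using System.Threading;
using System.Threading.Tasks;
using MediatR;

public class BlargDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The query side: a request describing the data you want...
public class GetBlargByIdQuery : IRequest<BlargDto>
{
    public int Id { get; set; }
}

// ...and a handler that knows how to fetch it.
public class GetBlargByIdHandler : IRequestHandler<GetBlargByIdQuery, BlargDto>
{
    public Task<BlargDto> Handle(GetBlargByIdQuery request, CancellationToken cancellationToken)
    {
        // Read from whatever read model fits; return a DTO shaped for the caller.
        return Task.FromResult(new BlargDto { Id = request.Id, Name = "Blarg" });
    }
}

Callers just await mediator.Send(new GetBlargByIdQuery { Id = 42 }) and never couple to the handler; commands look the same, minus the meaningful return value.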

Entity Framework 6 and const vs static readonly

Note: I haven’t tested this on Entity Framework Core, but I imagine the behavior is the same

Let’s say you have the following class:

public static class BlargTypes {
    public static readonly string Foo = "Bar";
}

and then you have the following LINQ query:

var result = dbContext.Blargs.Where(x => x.Blarg == BlargTypes.Foo);

then Entity Framework will generate a SQL query that looks like this:

SELECT [Id],[Blarg] FROM [Blargs] WHERE [Blarg] = @Blarg

passing in “Bar” as the parameter value for @Blarg.

Now, what if the value never changes? There’s really no need for a parameterized value for @Blarg. How do you manage that?

public static class BlargTypes {
    public const string Foo = "Bar";
}

Entity Framework will generate the following query:

SELECT [Id],[Blarg] FROM [Blargs] WHERE [Blarg] = 'Bar'

In summary: a const is baked into the compiled expression tree as a literal, so Entity Framework inlines it into the SQL, while a static readonly field is a member access evaluated at runtime, so it becomes a parameter. If you want a parameterized query, use static readonly; otherwise, use const.

Why I dislike the whole “Micro-ORMs for everything” movement

I primarily work on “B2B” applications, which are mostly basic CRUD operations and fancy reports glued together with fancy business logic.

For the longest time, a class of technology known as the ORM (object-relational mapper) made CRUD operations a gazillion times more productive: since you had objects that represented your data, and you wanted to store those objects in database records, an ORM made it stupid simple to Create, Read, Update, and Delete those objects.

ORMs have issues, though: depending on the complexity of the query, certain classes of queries come out “slow”. So technologies known as micro-ORMs started popping up: they bypass the ORM’s query generation entirely, running raw SQL and mapping the results back to objects.

And therein lies the problem: writing raw SQL for every single possible CRUD operation is a total PITA. An ORM is fine for your basic CRUD operations. Hell, an ORM is fine for the vast majority of queries an application does.
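To make that concrete (the names here are made up), compare a basic update through an ORM against the same thing by hand:

using System.Data;
using System.Threading.Tasks;
using Dapper;

public static class CrudComparison
{
    // The ORM route: a few lines, with the SQL generated for you.
    public static async Task ShipWithEf(AppDbContext dbContext, int orderId)
    {
        var order = await dbContext.Orders.FindAsync(orderId);
        order.Status = "Shipped";
        await dbContext.SaveChangesAsync();
    }

    // The raw-SQL route: every INSERT/SELECT/UPDATE/DELETE is written
    // and maintained by hand, forever.
    public static Task ShipWithRawSql(IDbConnection connection, int orderId) =>
        connection.ExecuteAsync(
            "UPDATE [Orders] SET [Status] = @Status WHERE [Id] = @Id",
            new { Status = "Shipped", Id = orderId });
}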

Micro-ORMs should be an optimization when performance problems arise — not a default way of thinking.

Basically, as a modern .NET developer, Entity Framework and Entity Framework Core are perfectly good tools. They make you dramatically more productive, which means you deliver more, faster; and as a software engineer, that is EXACTLY what you are paid to do: deliver product. Dapper is also a perfectly good tool, but it’s suited to a different use case than Entity Framework: it lets you drop to the bare metal where that makes sense.

Trust me, that page with a form you persist? You don’t need Dapper. A simple list? You don’t need Dapper. A page that returns a gazillion objects using a query only Satan himself would think is good? Sure, use Dapper. But for simple things, reach for Entity Framework or another full-featured ORM first, and save Dapper for the specific circumstances where it makes sense.

Note: This post is intended to represent my personal views, and not intended to represent the official opinion of my employer or its affiliates.

Does Fat Fit: 2018 Chevrolet Equinox LT

For people who don’t know me, I’m a rather large guy in three dimensions. I’m tall (just about 6’0″) and round (north of 400 lbs).

My truck (a 2010 Ford F-150) recently had its warranty expire, and it was beginning to show problems from both its age (8 years) and the accidents it’s been in (1 major, 2 minor). It also got HORRIBLE gas mileage on my daily commute from Hiram, GA to Buckhead in Atlanta, GA: between 8 and 14 mpg. So I convinced the wife to let me trade it in for something more economical.

And so we got the 2018 Chevy Equinox. It’s a compact SUV, in the same class as the Toyota RAV4 or Honda CR-V.

My fat fits perfectly. It’s a little snug on the sides, but not horribly so. The steering wheel doesn’t touch my gut or my knees. The seat belt actually fits! (It didn’t even fit in my F-150.) I can reach the pedals just fine.

I highly recommend it for other large people looking for a compact SUV.

Enums, C#, and Protocol Buffers

So, in the brave new world of microservice development, there’s this fancy technology called gRPC, which lets you define RPC-based microservices using Protocol Buffers as the IDL, with HTTP/2 as the transport underneath.

Now, Protocol Buffers’ IDL supports enums. However, it doesn’t allow values in different enums in the same package to share a name, such as:

enum MyEnum {
    FOO = 0;
    BAR = 1;
}

enum OtherEnum {
    FOO = 0;
    BAR = 1;
}

What to do, what to do? The protobuf compiler will complain that FOO and BAR are defined twice: protobuf enum values follow C++ scoping rules, where values are siblings of their enum rather than children of it, so two enums in the same package can’t share value names.

The suggested alternative is to prefix your enum values with the name of the enum in ALL CAPS:

enum MyEnum {
    MYENUM_FOO = 0;
    MYENUM_BAR = 1;
}

enum OtherEnum {
    OTHERENUM_FOO = 0;
    OTHERENUM_BAR = 1;
}

The cool thing is that the official C# codegen for protobuf strips those prefixes and converts this to idiomatic C# enums:

enum MyEnum {
    Foo = 0,
    Bar = 1
}

enum OtherEnum {
    Foo = 0,
    Bar = 1
}

Git: Require Work Item # (Or Similar) In Commit Messages

So here at Videa, we require that Git commits include the TFS Work Item # in the commit message, which links the work to the Work Item.

So that I never forget again, I wrote a commit-msg hook in Git to keep me honest.

Save the following as “.git/hooks/commit-msg” in your Git repo. This works on Windows and *nix with any tool that uses the git command line (SourceTree does; with other tools, YMMV):

#!/bin/bash
# commit-msg hook: reject commits that don't reference a TFS work item.
# Git passes the path of the file holding the commit message as $1.

# Accept messages that start with e.g. "Task: #123", "Bug: #45", or "WI: None",
# and let merge commits ("Merge branch '...' into ...") through untouched.
if grep -Eq "^((Fixes|Resolves|PBI|Bug|Task|Work Item|WI): (#[0-9]+|None)|Merge branch '(.*)' into)" "$1"; then
    exit 0
fi

echo "ERROR: Work Item # missing in commit message" 1>&2
exit 1
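One gotcha: on *nix, the hook file has to be marked executable (chmod +x .git/hooks/commit-msg) or Git won’t run it.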

Comcast X1 Gigabit vs AT&T Fiber (Gigapower)

I live in unincorporated Paulding County, Georgia in a “Planned Residential Development” subdivision (Ballentine Pointe). It’s one of the few subdivisions in a relatively rural area (Paulding is basically the rural-suburban fringe) that has both AT&T Fiber (thanks to them upgrading BellSouth FTTC infrastructure) and X1 Gigabit (being part of the Atlanta market).

Looking at JUST Internet, here’s how the two companies compare:

AT&T:

  • Delivery Method: Fiber
  • Speed: 1 Gbps down, 1 Gbps up
  • Included Data Usage: Unlimited
  • Price: $70/mo

Comcast Xfinity:

  • Delivery Method: DOCSIS 3.1 over coax
  • Speed: 1 Gbps down, 35 Mbps up
  • Included Data Usage: 1 TB
  • Price: $139.95/mo

What is Comcast thinking? They are delivering a slower, lower-quality product with less included data for quite literally twice the cost.

TWICE THE COST.

Then we start looking at bundling TV with the deal….

AT&T offers your choice of U-Verse 450 or DirecTV Premier, with 4 TVs and whole home DVR (and 4K with DirecTV) for less than $200/mo for 2 years (and $250/mo after that)

I’m currently on the X1 HD Complete bundle, which currently costs $200/mo, with a normal price of $250 after 2 years, basically the same deal AT&T offers.

Except that it only includes 105/20 Mbps internet and it has a 1 TB data usage limit.

I could quite literally get gigabit internet over fiber, extend my promotional pricing for another 2 years, stop worrying about data usage (we watch a lot of Netflix, Amazon Video, Hulu, etc.), and get Disney Junior and Nick Jr. in HD (Xfinity only carries them in SD for some reason; they used to be in HD, and I have a 1 year old, after all), just by switching to AT&T.

However, I’ve been a Comcast customer for quite literally half my life at this point. I really don’t want to switch, since it’s annoying. I just wish a human being at Comcast would realize that they are losing a very valuable, high-ARPU customer because their pricing is stupid and non-competitive.

However, Comcast has been disappointing so far. I’m going to keep pushing them to actually compete; for various reasons, they have until April to do so.

New Job, Yet Again

I ran into a problem many people in the startup world have: funding runs dry. As such, I’ve embarked on a change in my career path.

I’m now a Principal Software Engineer at Videa, part of Cox Media Group, a Cox Enterprises business.