Testing With the Database

Project Source Code

Get the project source code below, and follow along with the lesson material.

Download Project Source Code

To set up the project on your local machine, please follow the directions provided in the README.md file. If you run into any issues with running the project source code, then feel free to reach out to the author in the course's Discord channel.


Lesson Transcript

  • [00:00 - 00:10] So it's fine that we can test our ping endpoint and that works as expected. But what about the other endpoint that accesses the database?

  • [00:11 - 00:18] For that we obviously need to have access to data of some sort. And there are a number of approaches that people take to enable that.

  • [00:19 - 00:30] Some people like to mock their database, which has the huge advantage that it's very performant. And it's also quite easy to specify what data you want to return.

  • [00:31 - 00:43] Others tend to use SQLite as a sort of approximation for their production database, at least if they're not using SQLite in production, which is becoming more and more popular. But obviously that's not what we're doing here.

  • [00:44 - 00:56] And I think it should be obvious already why neither of these two approaches will work for us. We're using tsrange, the range type, which is already something that SQLite has no equivalent of.

  • [00:57 - 01:07] So we couldn't use SQLite even as an approximation. And mocking data is unfortunate because we already have some sophisticated constraints in our database.

  • [01:08 - 01:21] And we would really like to have test coverage of those, because they are quite delicate and we want to make sure that they work as well when testing our back end. So what we need to do is have a Postgres server running.

  • [01:22 - 01:37] And then how does one do that? Well, we could have just one test database running for our entire team that everyone shares, and then hope that we can write our tests in a way that copes with a test database that is in a sort of undefined state.

  • [01:38 - 01:52] Maybe each test would make sure to create its own little environment or something. But in my opinion, the ultimate solution is to have each test get its own clean, or at least well-defined, test database set up for it.

  • [01:53 - 02:03] And that's exactly what we're going to do. So the approach that I use is to create a test template database.

  • [02:04 - 02:14] And that is basically just an empty database that has been migrated to the latest version. And then for each test suite, I will clone that database and run the test in there.

  • [02:15 - 02:21] That might sound a bit excessive to you. And it certainly did to me. You might be wondering if we couldn't be doing something with transactions instead.

  • [02:22 - 02:31] But the thing is, each of our requests creates a transaction itself. And since Postgres doesn't have a concept of nested transactions yet, we can't do that.

  • [02:32 - 02:35] That might come in the future and I hope it will. And that might make things easier.

  • [02:36 - 02:39] But for now, we can't do that. There's something called savepoints.

  • [02:40 - 02:48] But those are made within the transaction so those don't work either. This approach, however, has worked for me and it works surprisingly well.

  • [02:49 - 02:53] Postgres is very fast at cloning databases. So this actually doesn't seem to be a problem.

  • [02:54 - 03:06] I would at least recommend you try and start with it. And by the time you realize that this is becoming a problem for you, maybe nested transactions will have been implemented, or maybe you can think of a different approach to help your performance.

  • [03:07 - 03:18] But I recommend that you try and start with this. Now we have a slight architectural problem because our tests in the back end are expecting a test database which is migrated.

  • [03:19 - 03:25] But our migrations live in the schema package and they don't actually live in the source folder. They're not part of what's distributed.

  • [03:26 - 03:36] So we can't really access them from the back end. It doesn't matter that much though, because we're using Knex, and Knex just takes a folder as an argument for where it should find the migrations.

  • [03:37 - 03:45] So we just need to provide it with the folder name of where the schema sits. We could obviously just hard code this or just make a relative path.

  • [03:46 - 04:05] But I've typically used a slightly less hacky approach, which is to expose the directory name as a variable from the schema package. So let's try and create a file called __schemaDirName in the source code here in the schema package.

  • [04:06 - 04:18] And I'll use two underscores to indicate that this is a hacky thing that isn't really part of the exposed API otherwise. And I will paste this in from the article.

  • [04:19 - 04:32] So this exposes a variable called __schemaDirName, which is the __dirname that Node provides us with. So that will be the folder of the schema package.
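As a rough sketch, such a file might look like this; the file location and the exact export name are assumptions, following the convention described above:

```ts
// packages/schema/src/__schemaDirName.ts (illustrative path)
// The double underscore signals that this is a hack, not part of the public API.
export const __schemaDirName = __dirname;
```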

  • [04:33 - 04:42] So when we have that in place we can just go into the migrations folder. So we will be able to reference this from our back end package.

  • [04:43 - 04:53] With that in place, let's start setting up our test helpers in the back end. I'm going to create a folder called test helpers, because there are going to be a few things here.

  • [04:54 - 04:59] And this is a convention I typically follow that's quite common. So I have this test helpers folder.

  • [05:00 - 05:22] I'm going to create first a new file called getKnexFor.ts, which is a little helper function that will give us a Knex instance with the same connection settings that we had before, but for a specific database. So we can specify the database name as a string here and it will return that instance.
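A minimal sketch of such a helper, assuming the connection settings come from environment variables; the way the course actually reads its config may differ:

```ts
// testHelpers/getKnexFor.ts: a Knex instance for an arbitrary database on the same server.
import knex, { Knex } from "knex";

export function getKnexFor(database: string): Knex {
  return knex({
    client: "pg",
    connection: {
      host: process.env.PGHOST ?? "localhost",
      port: Number(process.env.PGPORT ?? 5432),
      user: process.env.PGUSER ?? "postgres",
      password: process.env.PGPASSWORD ?? "postgres",
      database,
    },
  });
}
```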

  • [05:23 - 05:33] We're going to be needing this to connect to test databases as we create them. Now that we have those two things we can create a global setup file.

  • [05:34 - 05:52] So let's create that globalSetup.ts and I'm going to paste this in here. So let me just point out that there's an error here, and the reason for that is probably that I didn't build my schema.

  • [05:53 - 06:14] So I think I need to run tsc, like so. Let's see if this helps.

  • [06:15 - 06:19] You can see the file there, so I would expect this to work.

  • [06:20 - 06:27] It doesn't seem to be discovering it. Let me just try and restart my TypeScript server and see if that helps.

  • [06:28 - 06:31] Sometimes you have to do that. There we go.

  • [06:32 - 06:38] So this is now resolved. Let's take a look at what it actually says; I'm just going to go to the __dirname.

  • [06:39 - 06:51] So we now have that imported here, so let me just go over what this global setup function does. There's a global setup function that Vitest is going to be using.

  • [06:52 - 07:07] So that means this is a function that will be called just once when I run tests, not for each test or for each test suite, only once. So what I want this thing to do is to make sure that there is a test template database.

  • [07:08 - 07:20] So I start by getting the Knex connection for the postgres database, and that is the sort of meta database that always exists. And the reason why I'm just getting a connection for that is that I'm going to be creating a database down here.

  • [07:21 - 07:28] And we really just need a connection to any database to do that. And the Postgres one is just the one that is always in place.

  • [07:29 - 07:37] We call our template database easybooking_test_template. So what this does here is it drops that if it exists and then it creates it.

  • [07:38 - 07:48] We then destroy the Knex connection for the postgres database. And then we create a new Knex connection for our new template database.

  • [07:49 - 07:57] We then figure out what the migration folder should be. And we're doing that by taking the __schemaDirName, which points into the source folder, stepping out of it since the migrations don't live in there, and then appending /migrations.

  • [07:58 - 08:06] So this will be pointing to this folder here, which is where our migrations are. And we want Knex to just run those.

  • [08:07 - 08:17] So we do that simply by telling it to migrate to latest with this migration folder. We then destroy the test template connection.

  • [08:18 - 08:26] And then we print out some performance information. I like to do this just to keep track of how long my tests are taking.

  • [08:27 - 08:34] Because if my test template database is starting to get big, maybe this is going to become an issue. And I want to just keep track of it.
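Putting the steps just described together, the global setup might look roughly like this. It is a sketch: getKnexFor and __schemaDirName are the helpers from above, and the import paths and package name are assumptions.

```ts
// testHelpers/globalSetup.ts: runs once per test run (wired up in the Vitest config below).
import path from "path";
import { getKnexFor } from "./getKnexFor";
import { __schemaDirName } from "schema"; // hypothetical name of the schema workspace package

const TEMPLATE_DB = "easybooking_test_template";

export async function setup(): Promise<void> {
  const start = Date.now();

  // Connect to the always-present "postgres" meta database so we can
  // drop and (re)create the template database.
  const postgresDb = getKnexFor("postgres");
  await postgresDb.raw(`DROP DATABASE IF EXISTS ${TEMPLATE_DB}`);
  await postgresDb.raw(`CREATE DATABASE ${TEMPLATE_DB}`);
  await postgresDb.destroy();

  // Migrate the fresh template database to the latest version.
  const templateDb = getKnexFor(TEMPLATE_DB);
  const migrationFolder = path.join(__schemaDirName, "..", "migrations");
  await templateDb.migrate.latest({ directory: migrationFolder });
  await templateDb.destroy();

  console.log(`Test DB template set up in ${((Date.now() - start) / 1000).toFixed(2)}s`);
}
```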

  • [08:35 - 08:44] So I always print this info out just to have it available everywhere. Now, in order to make this global setup work, we need to configure Vitest.

  • [08:45 - 08:51] We don't have a Vitest configuration file yet, so I'm going to create one here in the back end.

  • [08:52 - 09:01] It's going to be called vitest.config.ts. And it will contain the following.

  • [09:02 - 09:07] So this is copied from the documentation. The defineConfig comes from Vite, for whatever reason.

  • [09:08 - 09:16] I guess that's because Vitest is built on top of Vite. And we're defining it to say, we want this global setup file to be run.

  • [09:17 - 09:25] We're also defining that the sequence of hooks should be a stack. And I'm going to get back to why this is important a little bit later.
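A minimal configuration along those lines might look like this; the globalSetup path is a placeholder:

```ts
// vitest.config.ts
/// <reference types="vitest" />
import { defineConfig } from "vite";

export default defineConfig({
  test: {
    // Run the template-database setup once per test run.
    globalSetup: ["./src/testHelpers/globalSetup.ts"],
    sequence: {
      // Run teardown hooks in reverse (stack) order; more on why later.
      hooks: "stack",
    },
  },
});
```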

  • [09:26 - 09:41] But for now, we'll just assume that I'm right when I say that it is. So with this in place, let's try and run our back end tests.

  • [09:42 - 09:48] And see what happens. Test DB template set up 0.44 seconds.

  • [09:49 - 09:56] So this isn't very interesting yet because none of our tests had changed. But you can see that it does at least something to the test template.

  • [09:57 - 10:02] So let me try and show you here in my Postgres Explorer. Oh, actually, yeah, there we go.

  • [10:03 - 10:07] There's the test template. And it should be migrated the same way as our regular database.

  • [10:08 - 10:13] You can see all the tables here. So that gets us part of the way.

  • [10:14 - 10:24] But there's another two things we need to do before we can write tests that rely on a database. First of all, we need to create something that will clone this template database for our test suites.

  • [10:25 - 10:38] And secondly, we need to make sure that our back end will actually connect to the test database whenever it's running a test. So in order to do that, let's go into our getDb function here.

  • [10:39 - 10:44] This is what we call from our back end in order to get the database connection. And as you can see, it uses a singleton here.

  • [10:45 - 10:54] So there's an instance called _db that is just returned if it exists. And if it doesn't exist, it will connect to the easybooking database.

  • [10:55 - 10:59] So this will always give us a connection to a database called easybooking. And we need to change that.

  • [11:00 - 11:16] We're going to simply add a function here that we will be using for tests, called attachDb, which will take a Knex instance. And it returns void.

  • [11:17 - 11:26] And it will just say _db equals db. So this will basically just override this singleton with whatever we want it to be.
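In sketch form, the singleton and the new test-only override might look like this; the connection details are placeholders:

```ts
// db.ts: the backend's shared database connection.
import knex, { Knex } from "knex";

let _db: Knex | undefined;

export function getDb(): Knex {
  // Lazily create the singleton, pointing at the regular development database.
  if (!_db) {
    _db = knex({
      client: "pg",
      connection: { database: "easybooking" }, // other settings omitted in this sketch
    });
  }
  return _db;
}

// Test-only: replace the singleton with an arbitrary Knex instance.
export function attachDb(db: Knex): void {
  _db = db;
}
```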

  • [11:27 - 11:41] So now we have a little utility that we can use from our tests to make sure that we connect to the database that we want. So now we need to create a function that will actually clone the test template database into our specific test database.

  • [11:42 - 11:54] Obviously, we also need to destroy the database after the test has run, because we don't want to be flooding our Postgres server with databases every time we run tests. So that's quite a bit of plumbing code we need for setting up our test fixtures.

  • [11:55 - 12:07] And we would like to abstract that into something. Vitest, like Jest, has these beforeEach and afterEach as well as beforeAll and afterAll calls that you can use to help set up test fixtures.

  • [12:08 - 12:22] And that gets us some of the way, but I like to abstract these calls into a function of their own, so that you just have one line of setup for a specific piece of test fixture. I tend to refer to these as test hooks.

  • [12:23 - 12:35] That's because it sort of resembles React hooks a little bit, in the way that a React hook can offer some setup and teardown code with, say, a useEffect call. But I want to stress that this has nothing to do with React hooks other than that.

  • [12:36 - 12:51] So if you get confused by naming these the way that you name React hooks, go ahead and use some other naming convention. I just tend to call my test fixture hooks use and then whatever they're using.

  • [12:52 - 13:01] So we're going to now create a test hook called useTestDatabase. This is also a test helper.

  • [13:02 - 13:08] So let's create it in the same folder here: useTestDatabase.ts.

  • [13:09 - 13:13] And let me quickly show you what this does. So the hook is defined here.

  • [13:14 - 13:25] And as you can see here, it basically wraps a beforeAll and an afterAll call. So it has a little instance called db, which is the Knex connection that we're creating.

  • [13:26 - 13:41] And in the beforeAll, the first thing we do is start some performance monitoring, because similar to when we're setting up the template database, we want to keep track of whether this is starting to take a long time. And here I don't print out information every time we run, because that would give a lot of noise in the console.

  • [13:42 - 13:55] So I only print out something if it took more than two seconds to set up because then it's probably something we want to deal with. But basically what it does is it establishes a connection to our test template database.

  • [13:56 - 14:03] And then it creates a new name for our new database. So we could just call this easybooking_test_db or something like that.

  • [14:04 - 14:16] But I like to create an explicit name that is based on the test suite that we're inside. You can see I have a little helper function, which is defined up here, that uses a Vitest function to get the name of the suite that's running.

  • [14:17 - 14:42] And what's nice about this is, first of all, it avoids any race conditions or anything if we have tests running in parallel. But secondly, if we decide to call this hook with preserveAfterTest set to true, then it won't destroy the database when it's done, which means that if we're debugging some code, we can inspect the database afterwards to see what the state of things was.

  • [14:43 - 14:57] And for that reason, it's nice to have an explicit database name. So when we have the connection to the template, we first drop our new test database if it exists, and then we create it, and we create it with this template.

  • [14:58 - 15:06] And this is the way in Postgres you clone a database. So that's basically cloning the template into this new test database.

  • [15:07 - 15:18] We then create a Knex connection to that and we call attachDb with this new connection. So now the backend is going to connect to our test database when it does anything in the database.

  • [15:19 - 15:25] So why do we also need to store it here? Well, basically down here you can see I return a closure that will give us this instance.

  • [15:26 - 15:43] And the reason for that is that it might be nice for a test to run a little piece of SQL to add a member or to change the name or something that we don't want to go through an endpoint in order to do. And then it's helpful to have the database connection handy as well.

  • [15:44 - 15:59] The afterAll method should be fairly straightforward to understand now. We're first of all destroying the database connection, and then we are dropping our database if we're not preserving it, and then we also destroy the template connection.
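Putting the whole hook together, a sketch under the assumptions above might look like this. Note that the lesson derives the database name from the running suite via a small Vitest-based helper; to keep the sketch self-contained, a timestamp stands in for that here, and the import paths are placeholders.

```ts
// testHelpers/useTestDatabase.ts: clone the template database for the current test suite.
import { afterAll, beforeAll } from "vitest";
import { Knex } from "knex";
import { getKnexFor } from "./getKnexFor";
import { attachDb } from "../db"; // hypothetical path to the module above

const TEMPLATE_DB = "easybooking_test_template";

export function useTestDatabase(preserveAfterTest = false): () => Knex {
  let db: Knex;
  // The lesson names this after the test suite; a timestamp stands in here.
  const testDbName = `test_${Date.now()}`;

  beforeAll(async () => {
    const start = Date.now();

    // Clone the migrated template into a fresh database for this suite.
    const templateDb = getKnexFor(TEMPLATE_DB);
    await templateDb.raw(`DROP DATABASE IF EXISTS ${testDbName}`);
    await templateDb.raw(`CREATE DATABASE ${testDbName} TEMPLATE ${TEMPLATE_DB}`);
    await templateDb.destroy();

    // Point the backend's singleton at the cloned database.
    db = getKnexFor(testDbName);
    attachDb(db);

    // Only complain if setup is getting slow.
    const seconds = (Date.now() - start) / 1000;
    if (seconds > 2) {
      console.warn(`Test database setup took ${seconds.toFixed(2)}s`);
    }
  });

  afterAll(async () => {
    await db.destroy();
    if (!preserveAfterTest) {
      const templateDb = getKnexFor(TEMPLATE_DB);
      await templateDb.raw(`DROP DATABASE IF EXISTS ${testDbName}`);
      await templateDb.destroy();
    }
  });

  // Return a closure so tests can run raw SQL against the cloned database.
  return () => db;
}
```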

  • [16:00 - 16:10] So with this, let's try and create a test for our getServiceTypes endpoint. Let me go into our test file here.

  • [16:11 - 16:18] So we want to add another it block here. But first I want to show you where I'm using this hook and that is up here.

  • [16:19 - 16:33] So this is where I'm calling it: in the describe block, not in the it or test function itself. That actually wouldn't work, because what useTestDatabase does is call beforeAll and afterAll.

  • [16:34 - 16:37] And those aren't available inside it. Well, they're available, but they don't do anything in there.

  • [16:38 - 16:46] So we need to call it here. And what this means is that our test database is now available to all of the tests that run inside of this describe block.

  • [16:47 - 17:01] And maybe I should note here that I'm not sure it would work if you have nested describe blocks and use nested useTestDatabase calls. It probably would, but I think the parent test database wouldn't be re-established when you exit an inner describe block.

  • [17:02 - 17:07] So it hasn't been fully fleshed out for that use case. If you want to do so, go ahead.

  • [17:08 - 17:19] But probably I will be turning this into an npm package at some point, because I also don't want to keep writing this in all of my code. But for now it works for single describe blocks like this.

  • [17:20 - 17:33] So let's create another test here for our getServiceTypes. So it should return the existing service types.

  • [17:34 - 17:41] And we're going to call that. So we'll get a caller like we did up here.

  • [17:42 - 17:53] It's maybe important to mention that this is a way to call a tRPC endpoint directly, sort of short-circuiting it. So I mentioned before that we're doing integration tests of our back end.

  • [17:54 - 18:17] And what I mean by that is that it integrates the database with the endpoints, but it's not a full end-to-end test of the back end, if you will, in that we're not going through Koa and there is no HTTP involved here. And this is important because it means, for instance, that something like not setting the transformer to superjson, like we did early on, might not be something you would catch with these tests.

  • [18:18 - 18:27] So you should definitely do some end-to-end tests that go through HTTP and everything as well. But for these tests, this is what we're going to be doing.

  • [18:28 - 18:39] So I'm going to get the service types from our caller. And you can see I have this available here.

  • [18:40 - 19:13] And since our test database doesn't actually contain anything, we're going to expect this to be an empty array. And obviously that isn't very interesting, but what we're testing here is mostly that nothing crashes; but also, this will at least make sure that we're not referencing our developer database, which does in fact have something in it.
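Put together, the describe block might look roughly like this; the router import path, the empty context object, and the procedure name follow the earlier lessons and are placeholders:

```ts
// appRouter.test.ts (sketch)
import { describe, expect, it } from "vitest";
import { appRouter } from "../appRouter"; // hypothetical import path
import { useTestDatabase } from "./testHelpers/useTestDatabase";

describe("app router", () => {
  // Clones the template database once for all tests in this block.
  useTestDatabase();

  it("should return the existing service types", async () => {
    const caller = appRouter.createCaller({});
    const serviceTypes = await caller.getServiceTypes();

    // The freshly cloned test database is empty, so we expect no rows.
    expect(serviceTypes).toEqual([]);
  });
});
```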

  • [19:14 - 19:24] Let's try and run tests. All right, both tests succeed and we get an empty array for our service types.

  • [19:25 - 19:32] Let's try, just for fun, to set preserveAfterTest to true. Before we do that, let's just inspect here what we have.

  • [19:33 - 19:37] So we have the test template database. And that is it.

  • [19:38 - 19:42] Let's try and run the tests again. Oh, I forgot to save.

  • [19:43 - 19:51] And then let's refresh this. And you can see we now have test_app_router.

  • [19:52 - 19:58] And that's because the describe is named app router here. And that should have everything else in it.

  • [19:59 - 20:08] Nothing in any of the tables because we didn't create anything, but at least it has been a proper clone. So far so good.

  • [20:09 - 20:16] But obviously we want to test something that's more interesting than just an empty array. We need some data in our test database.

  • [20:17 - 20:23] And there are a couple ways we can go about this. So one is that we could use the DB instance that's returned from the hook.

  • [20:24 - 20:36] And we could insert some data either at the top of our test or maybe in a before all or before each call or something like that. Alternatively, we could create an entire system for seeding our test database.

  • [20:37 - 20:50] And this is probably something we would want to do when we start to create a more complex product. If you need a bunch of data in a number of different tables in order to establish a realistic testing scenario, seed data is the way to go.

  • [20:51 - 21:02] But we're not going to get into that just yet at least. So the last thing we can do is that we can take advantage of the fact that our test database survives the entire describe block.

  • [21:03 - 21:27] So that means that if one test does something that inserts data in the database, the tests that follow it will be able to read that data and make assertions on it. Some purists might argue that this is actually an anti-pattern, because each test should be completely independent and able to run on its own, not depend on any other tests, and no ordering should be required.

  • [21:28 - 21:53] I think it's okay, though, to do this in a describe block, because those are still small, isolated pieces of code, and having one test in a describe block rely on a different test in the same describe block is, I think, acceptable. So what we can do is create another endpoint that we would probably need anyway, called createServiceType, which we will then create a test for that will insert a service type.

  • [21:54 - 22:07] And then after that we can create another test that will check that getServiceTypes returns the new service type that has just been created. So let's start by creating this createServiceType endpoint.

  • [22:08 - 22:29] We'll go into our app router again and we'll expand this down here. Let me just paste some code that I have here, which relies on a createServiceType input that is another Zod object, so let me just paste that down here.

  • [22:30 - 22:43] So createServiceType takes a name and a description as its input. We could use the initializer object that we have generated with Kanel, but this is so simple that let's just do it here, at least for now.

  • [22:44 - 23:04] So we specify that this is the input we want to take. So this is the first time we're actually doing runtime validation of parameters to an endpoint: these two have to be strings when we call our createServiceType. And then we're outputting a service type, which is just the thing that we were importing from our model.

  • [23:05 - 23:13] And the endpoint itself is very trivial. It uses the transaction to access the serviceType table and it inserts the input.

  • [23:14 - 23:26] And since that just has a name and description, that will just work as it is. And then we return the inserted object, so that will be the result, and that is a ServiceType, which is what comes out of createServiceType.
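A sketch of what such an endpoint might look like; the procedure builder, context, table name, and import paths are assumptions standing in for the course's actual router setup:

```ts
import { initTRPC } from "@trpc/server";
import { z } from "zod";
import { getDb } from "./db"; // hypothetical path

const t = initTRPC.create();

export const createServiceTypeInput = z.object({
  name: z.string(),
  description: z.string(),
});

// In the course this lives inside the existing app router; it is shown as a
// standalone procedure here only to illustrate the shape of the endpoint.
export const createServiceType = t.procedure
  .input(createServiceTypeInput)
  .mutation(async ({ input }) => {
    // Insert the row and return it, relying on Postgres' RETURNING clause.
    const [serviceType] = await getDb()("serviceType")
      .insert(input)
      .returning("*");
    return serviceType;
  });
```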

  • [23:27 - 23:39] So with that in place let's try and update our test. So I'll go into our test here and then we will create a test for this creation endpoint.

  • [23:40 - 23:50] So I am going to paste some code here. Actually, this should say that it should create the service type and return the object.

  • [23:51 - 23:54] There we go. So this should create the service type and return the object.

  • [23:55 - 24:01] So we create a caller like we did the other times and we call this. And now we have to specify the parameters here.

  • [24:02 - 24:18] And so you can see that it's determined that the parameters are of the correct type. And then what comes out of that is a service type, and then we expect that to have the ID 1 and the name and the description that we've given it.

  • [24:19 - 24:29] So we don't really care about the ID but since we are running in an empty test database this should be the first object. So we would expect it to have ID one.

  • [24:30 - 24:44] And we're going to assert that here just because if it happens that it creates something with a different ID at one point that means that something has changed and we will probably want to be alerted about that. So this is a good way to just do that.
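A sketch of that test, with placeholder values for the name and description; it goes inside the same describe block as the earlier sketch, so the imports are the same:

```ts
it("should create the service type and return the object", async () => {
  const caller = appRouter.createCaller({});

  const serviceType = await caller.createServiceType({
    name: "Haircut", // placeholder values
    description: "A basic haircut",
  });

  // First row inserted into a freshly cloned, empty database, so id should be 1.
  expect(serviceType).toEqual({
    id: 1,
    name: "Haircut",
    description: "A basic haircut",
  });
});
```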

  • [24:45 - 24:58] We could remove it because it isn't really asserting anything of semantic value but I think it still adds a little bit of value here. Then we can add the final check, which is where the...
