Over the last few years, through teaching and an increasing full-stack workload, I have found more of my day-to-day devoted to writing JavaScript. More often than not, if I am working on something outside of native mobile, I am using JavaScript. Recently I had cause to create a CRUD API layer to expose a few different SQL stores. I hoped to keep dependencies to a minimum but still provide a capable set of services. I believe I struck the balance well, and will review some of it in this post. It’s a pickle!
My idea is to create a dynamic CRUD (Create, Read, Update, Delete) layer that exposes the operations based on the SQL table schemas. This produces a stateless set of code in which the requesting app has zero data-model knowledge prior to runtime. I need a script that interacts with the database to fetch table schemas and then generates the corresponding Express routes for each CRUD operation. I use Supabase as the backend provider for a few of my apps, so for this example I will use Postgres. The appropriate `npm install pg express` is required, but otherwise it’s all code.
Step 1: Set Up Database Connection
First, I set up a connection object to the Postgres database. One of the more interesting JavaScript features I have found is object destructuring. It lets me pull `Pool` straight out of the pg module and then pass the connection its expected values.
const { Pool } = require('pg'); // object destructuring pulls Pool out of the pg module

const pool = new Pool({
  user: 'your_username',
  host: 'your_host',
  database: 'your_database',
  password: 'your_password',
  port: 5432, // Default Postgres port
});
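As a minimal illustration of the destructuring pattern mentioned above (the `config` object and its values here are hypothetical, standing in for a module's exports):

```javascript
// A plain object, standing in for what a require() call hands back
const config = { user: 'demo', host: 'localhost', port: 5432 };

// Destructuring pulls named properties into local variables in one statement
const { user, port } = config;

console.log(user); // 'demo'
console.log(port); // 5432
```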
Step 2: Function to Get Table Schema
So now that I have a theoretical DB connection, I can create a function to get the schema of a table. This portion is fairly straightforward: I retrieve the schema information for a provided SQL table and map the results to a new key-value object. I grab the pkey because it will be the reference for the subsequent -RUD operations.
async function getTableSchema(tableName) {
  try {
    // Query to get columns and their data types
    const columnsResult = await pool.query(
      `SELECT column_name, data_type
       FROM information_schema.columns
       WHERE table_name = $1`,
      [tableName]
    );

    // Query to get the primary key
    const pkResult = await pool.query(
      `SELECT kcu.column_name
       FROM information_schema.table_constraints AS tc
       JOIN information_schema.key_column_usage AS kcu
         ON tc.constraint_name = kcu.constraint_name
        AND tc.table_schema = kcu.table_schema
       WHERE tc.table_name = $1
         AND tc.constraint_type = 'PRIMARY KEY'`,
      [tableName]
    );

    // Map the rows to a { column_name: data_type } object
    const columns = Object.fromEntries(
      columnsResult.rows.map(row => [row.column_name, row.data_type])
    );

    return { columns, primaryKey: pkResult.rows[0]?.column_name };
  } catch (err) {
    console.error(`Error fetching schema for ${tableName}:`, err);
    throw err;
  }
}
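To make the mapping step concrete, here is what the column rows reduce to; the sample rows are hypothetical, shaped like what the `information_schema.columns` query returns:

```javascript
// Hypothetical rows, shaped like the result of the information_schema.columns query
const columnRows = [
  { column_name: 'id', data_type: 'integer' },
  { column_name: 'title', data_type: 'text' },
  { column_name: 'created_at', data_type: 'timestamp with time zone' },
];

// Reduce to a { column_name: data_type } key-value object
const schema = Object.fromEntries(
  columnRows.map(row => [row.column_name, row.data_type])
);

console.log(schema);
// { id: 'integer', title: 'text', created_at: 'timestamp with time zone' }
```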
Step 3: Generate CRUD Routes
This is where most of the heavy lifting is done. The implementation uses the primary key to reference objects. Each operation constructs its appropriate query string and an array of values from the request body. SQL magic happens, and then it’s either a 404, 500, or 200 with a JSON response.
// UPDATE
app.put(`/api/${tableName}/:${primaryKey}`, async (req, res) => {
  // Build "col = $n" pairs; $1 is reserved for the primary key value
  const setClause = Object.keys(req.body)
    .map((col, i) => `${col} = $${i + 2}`)
    .join(', ');
  const query = `UPDATE ${tableName} SET ${setClause} WHERE ${primaryKey} = $1 RETURNING *`;
  try {
    const result = await pool.query(query, [req.params[primaryKey], ...Object.values(req.body)]);
    if (result.rows.length === 0) {
      return res.status(404).send('Record not found');
    }
    res.json(result.rows[0]);
  } catch (err) {
    console.error('Error on UPDATE operation:', err);
    res.status(500).send('Error on UPDATE operation');
  }
});
  // DELETE
  app.delete(`/api/${tableName}/:${primaryKey}`, async (req, res) => {
    try {
      await pool.query(`DELETE FROM ${tableName} WHERE ${primaryKey} = $1`, [req.params[primaryKey]]);
      res.send('Record deleted successfully');
    } catch (err) {
      console.error('Error on DELETE operation:', err);
      res.status(500).send('Error on DELETE operation');
    }
  });
}
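The routes above interpolate the table name and build a numbered placeholder list from the request body. That construction can be factored into a small, pure helper, which a CREATE route would feed to `pool.query`; the helper name `buildInsertQuery` and the sample table are mine, not from the original:

```javascript
// Hypothetical helper: build a parameterized INSERT statement from a body object.
// A CREATE route would pass text and values straight to pool.query(text, values).
function buildInsertQuery(tableName, body) {
  const columns = Object.keys(body);
  const placeholders = columns.map((_, i) => `$${i + 1}`); // $1, $2, ...
  return {
    text: `INSERT INTO ${tableName} (${columns.join(', ')}) VALUES (${placeholders.join(', ')}) RETURNING *`,
    values: Object.values(body),
  };
}

const { text, values } = buildInsertQuery('books', { title: 'Dune', author: 'Herbert' });
console.log(text);
// INSERT INTO books (title, author) VALUES ($1, $2) RETURNING *
console.log(values);
// [ 'Dune', 'Herbert' ]
```

Keeping the query construction pure makes it easy to unit test without a live database connection.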
Step 4: Use the Functions
Finally, I set up the routes from an array of SQL table names by calling generateCRUDRoutes for each table. The only other element of import is the PORT, as I always prefer a nonstandard port like 8090.
const tableNames = ['YourTable1', 'YourTable2']; // Replace with your table names
tableNames.forEach(tableName => generateCRUDRoutes(app, tableName));

const PORT = 8090;
app.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});
Overall this is a pretty basic, single-file structure. As a POC for a potential implementation in projects, I think it checks a lot of the boxes. Future enhancements could abstract the database pool further to allow flexibility across SQL flavors, harden the queries against common injection techniques, allow filterable query parameters based on columns outside the pkey, and introduce some standard Express middleware. Less hardcoding, at a minimum.
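On the hardening and filterable-parameters points, one direction is to validate any caller-supplied column name against the schema fetched in Step 2 before it ever reaches a query string. A minimal sketch under that assumption (`isFilterableColumn` is a hypothetical name, and `schemaColumns` stands in for the column map a schema lookup would return):

```javascript
// Hypothetical sketch: only allow filter columns that exist in the fetched schema.
// schemaColumns stands in for the { column_name: data_type } map from Step 2.
function isFilterableColumn(schemaColumns, candidate) {
  return Object.prototype.hasOwnProperty.call(schemaColumns, candidate);
}

const schemaColumns = { id: 'integer', title: 'text' };

console.log(isFilterableColumn(schemaColumns, 'title'));             // true
console.log(isFilterableColumn(schemaColumns, 'title; DROP TABLE')); // false
```

Because column names cannot be bound as query parameters, whitelisting them against known schema metadata is the usual way to keep dynamic identifiers out of injection territory.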