firebase2graphql: Moving from Firebase to realtime GraphQL on Postgres
firebase2graphql -d firebase-data-export.json https://my-hasura-app.com
- Retaining the same query and data model structure as data is migrated from Firebase into Postgres
- Automatically normalizing the denormalized Firebase data model
- Our motivation behind this tool
- The engineering behind how this was done
Motivation:
1. Firebase realtime vs. GraphQL
- Frontend first development: The ability to iterate on your application querying for precise slices of data as required.
- The client-side SDK that made the Firebase developer experience so good has a close analogue in GraphQL: a GraphQL client paired with GraphiQL for browsing and exploring the API.
- Firebase’s realtime features can now be captured as GraphQL subscriptions, which offer a similar level of abstraction and convenience without having to muck about with the underlying sockets and connections.
subscription {
  user(where: {id: {_eq: 1}}) {
    id
    name
    email
    articles {
      id
      title
      content
    }
  }
}
2. Firebase vs. Postgres
Given that we can preserve the same developer experience with GraphQL, realtime features included, while moving to the world’s most advanced open-source database, why not?
The solution
1. Phase I: Exporting data from Firebase to Postgres preserving GraphQL query structure
firebase2graphql -d firebase-data-export.json https://my-hasura-app.com
- Each node of data maps to a “row”
- For every nested node, we create another table with a relationship to the parent table
- If the nested node is an “array” type, we create what we call an array relationship, basically a one-to-many relationship
- Otherwise, we create an “object” relationship, basically a many-to-one relationship
{
"posts": {
"-LMbLFOAW2q6GO1bD-5g": {
"author": "Eena",
"authorPic": "...photo.jpg",
"body": "Content body of this article",
"starCount": 0,
"title": "My first post",
"uid": "4UPmbcaqZKT2NdAAqBahXj4tHYN2"
},
"-LMbLIv6VKHYul7p_PZ-": {
"author": "Eena",
"authorPic": "...photo.jpg",
"body": "Content body of this article",
"starCount": 0,
"title": "Whatta proaaa",
"uid": "4UPmbcaqZKT2NdAAqBahXj4tHYN2"
}
},
"user-posts": {
"4UPmbcaqZKT2NdAAqBahXj4tHYN2": {
"-LMbLFOAW2q6GO1bD-5g": {
"author": "Eena",
"authorPic": "...photo.jpg",
"body": "Content body of this article",
"starCount": 0,
"title": "My first post",
"uid": "4UPmbcaqZKT2NdAAqBahXj4tHYN2"
},
"-LMbLIv6VKHYul7p_PZ-": {
"author": "Eena",
"authorPic": "...photo.jpg",
"body": "Content body of this article",
"starCount": 0,
"title": "Whatta proaaa",
"uid": "4UPmbcaqZKT2NdAAqBahXj4tHYN2"
}
}
},
"users": {
"4UPmbcaqZKT2NdAAqBahXj4tHYN2": {
"email": "[email protected]",
"profile_picture": "...photo.jpg",
"username": "Eena"
}
}
}
users (
_id text not null primary key,
email text,
profile_picture text,
username text
)
posts (
_id text not null primary key,
title text,
body text,
starCount int,
author text,
authorPic text,
uid text
)
user_posts (
_id text not null,
_id_2 text not null,
title text,
body text,
starCount int,
author text,
authorPic text,
uid text
)
2. Phase II: Automatically normalizing your Firebase data
- Look at the column names of each table; if another table has similar column names, we check whether they have overlapping data. This is done through a series of SQL queries on the imported data.
- If an exact subset or duplicate table is found, the table containing the subset is deleted, and any tables related to it are re-pointed at the superset table.
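The subset check can be sketched as follows. This is a hypothetical Python illustration (`is_subset_table` is not part of the tool, which runs SQL against the imported data), but the logic is the same: project both tables onto the columns they share and test row containment.

```python
def is_subset_table(small, big):
    """Return True if every row of `small` appears in `big` when both
    tables (lists of row dicts) are projected onto the columns they
    share, ignoring the generated _id/_id_2 key columns."""
    columns = lambda table: set().union(*(row.keys() for row in table))
    shared = (columns(small) & columns(big)) - {"_id", "_id_2"}
    if not shared:
        return False
    # project a row onto the shared columns, in a stable order
    project = lambda row: tuple(row.get(c) for c in sorted(shared))
    return {project(r) for r in small} <= {project(r) for r in big}
```

For the example above, user_posts duplicates the data in posts, so the check succeeds, user_posts can be dropped, and its relationships are re-pointed at posts.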
Take it for a spin
Related reading