Free Exercise DB 💪

Open Public Domain Exercise Dataset in JSON format, 800+ exercises with a browsable, searchable public frontend

Why?

I started building another fitness-related app and was looking for free/open source exercise lists and imagery. I stumbled upon exercises.json, which was amazing, though the data wasn't structured the way I wanted it. I also wanted a browsable/searchable frontend to the data, inspired by this issue, so I restructured the data and built a simple frontend for it :)

What do they look like?

All exercises are stored as separate JSON documents and conform to the following JSON Schema, eg.

{
  "id": "Alternate_Incline_Dumbbell_Curl",
  "name": "Alternate Incline Dumbbell Curl",
  "force": "pull",
  "level": "beginner",
  "mechanic": "isolation",
  "equipment": "dumbbell",
  "primaryMuscles": [
    "biceps"
  ],
  "secondaryMuscles": [
    "forearms"
  ],
  "instructions": [
    "Sit down on an incline bench with a dumbbell in each hand being held at arms length. Tip: Keep the elbows close to the torso.This will be your starting position.",
  ],
  "category": "strength",
  "images": [
    "Alternate_Incline_Dumbbell_Curl/0.jpg",
    "Alternate_Incline_Dumbbell_Curl/1.jpg"
  ]
}

See Alternate_Incline_Dumbbell_Curl.json

To further explore the data, you can use lite.datasette.io
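
For example, you can point Datasette Lite at the combined file via a URL query parameter (a sketch; the ?json= parameter and the dist/exercises.json path are assumptions based on the hosting notes below):

https://lite.datasette.io/?json=https://raw.githubusercontent.com/yuhonas/free-exercise-db/main/dist/exercises.json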

How do I use them?

You can check the repo out and use the JSON files and images locally
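
For example, a minimal checkout looks like this:

git clone https://github.com/yuhonas/free-exercise-db.git
cd free-exercise-db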

Alternatively

You can leverage GitHub's hosting and access the single or combined exercises.json. Prefix any of the image paths contained in the JSON with https://raw.githubusercontent.com/yuhonas/free-exercise-db/main/dist/exercises/ to get a hosted version of the image, eg. Air_Bike/0.jpg. You can also leverage something like imagekit.io for dynamic image resizing, which is utilized on the frontend example site.
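
As a rough sketch of the hosted approach (assuming the combined file is published at dist/exercises.json, matching the make task below), you could fetch the dataset and build a full image URL like this:

# Fetch the combined dataset from GitHub's raw content host
curl -sL -o exercises.json https://raw.githubusercontent.com/yuhonas/free-exercise-db/main/dist/exercises.json

# Print a hosted URL for the first image of the first exercise (requires jq)
jq -r '.[0].images[0]' exercises.json \
  | sed 's#^#https://raw.githubusercontent.com/yuhonas/free-exercise-db/main/dist/exercises/#'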

Build tasks

There are a number of helpful Makefile tasks that you can utilize

Linting

To lint all the JSON files against schema.json, use

make lint

Combining into a single JSON file

If you make changes to any of the exercises or add new ones, you can recombine all the single JSON files into a single JSON file containing an array of objects using the following make task

make dist/exercises.json

Note: requires jq
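
Under the hood this amounts to slurping the per-exercise files into one array with jq, roughly like the sketch below (the exercises/ source directory name is an assumption):

# Rough equivalent of the make task: slurp all per-exercise files into one array
jq -s '.' exercises/*.json > dist/exercises.json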

Importing into PostgreSQL

To combine all JSON files into Newline Delimited JSON suitable for import into PostgreSQL, use the following make task

make dist/exercises.nd.json

Note: requires jq

See also Importing JSON into PostgreSQL using COPY
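
A minimal sketch of one way to load the result, assuming a local database and a throwaway staging table (exercises_import is a hypothetical name; see the linked article for escaping edge cases):

# Create a single-column jsonb staging table (hypothetical table name)
psql -c "CREATE TABLE exercises_import (doc jsonb);"

# Stream the newline-delimited JSON in with \copy
# (plain-text COPY works as long as the documents contain no tabs or backslashes)
psql -c "\copy exercises_import (doc) FROM 'dist/exercises.nd.json'"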

Browsable frontend

Screenshot of browsable frontend

There is a simple searchable/browsable frontend to the data, written in Vue.js, available at yuhonas.github.io/free-exercise-db. All related code is in the site directory.

Setup

npm install

Compile and Hot-Reload for Development

npm run dev

Compile and Minify for Production

npm run build

Run Unit Tests with Vitest

npm run test:unit

Run End-to-End Tests with Cypress

npm run test:e2e:dev

This runs the end-to-end tests against the Vite development server. It is much faster than the production build.

But it's still recommended to test the production build with test:e2e before deploying (e.g. in CI environments):

npm run build
npm run test:e2e

Lint with ESLint

npm run lint

TODO

Incomplete fields

The following fields are incomplete in some JSON files, so schema.json has to allow null for them (a sketch of the nullable declaration follows the list)

  • force
  • mechanic
  • equipment
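
A minimal sketch of how a nullable field can be declared in schema.json, using the standard JSON Schema union-type pattern (the actual schema may word it differently):

"force": {
  "type": ["string", "null"]
}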

Images

There are also a small number of duplicate images eg.

jdupes --summarize --recurse .

Scanning: 2620 files, 874 items (in 1 specified)
25 duplicate files (in 22 sets), occupying 809 KB

Contributors

Made with contrib.rocks.

Contributions are always welcome! Please read the contribution guidelines first.

Special Thanks 🙇