Crispy is a machine-learning tool that makes video-game montages efficiently.
It uses a neural network to detect highlights in video-game frames.
Currently, it supports Valorant, Overwatch, CSGO2, and The Finals.
Releases are available for Windows and Linux.
First, you will need to install ffmpeg (ffprobe is also required).
Once unzipped, you can run the setup.[sh|bat] file.
Then you can add your videos (mp4) and your music (mp3) to the resources folder.
Currently, Crispy supports Python 3.8, 3.9, and 3.10.
You can configure the algorithm in the settings.json file.
This is where you select the game, among other options.
Config example:

```json
{
  "neural-network": {
    "confidence": 0.8
  },
  "clip": {
    "framerate": 8,
    "second-before": 3,
    "second-after": 3,
    "second-between-kills": 5
  },
  "stretch": false,
  "game": "valorant"
}
```
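As an illustration (not part of Crispy itself), the file can be parsed with Python's standard json module; hyphenated keys like "neural-network" are simply dictionary keys:

```python
import json

# The example configuration from above, embedded as a string so the
# snippet is self-contained; in practice you would read settings.json.
config_text = """
{
  "neural-network": { "confidence": 0.8 },
  "clip": {
    "framerate": 8,
    "second-before": 3,
    "second-after": 3,
    "second-between-kills": 5
  },
  "stretch": false,
  "game": "valorant"
}
"""

settings = json.loads(config_text)
print(settings["game"])                          # valorant
print(settings["neural-network"]["confidence"])  # 0.8
```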
The following settings are adjustable:
- neural-network
  - confidence: The confidence level used by the neural network. Lower values will include more frames, but might introduce false positives.
- clip
  - framerate: The framerate at which the neural network is applied to the clip. Higher values will include more frames, but take more time to process.
  - second-before: Seconds of gameplay included before the highlight.
  - second-after: Seconds of gameplay included after the highlight.
  - second-between-kills: Transition time between highlights. If the time between two highlights is less than this value, both highlights are merged into one.
- game: Chosen game (either "valorant", "overwatch", "csgo2", or "thefinals")
- stretch: Use this option if you play in a 4:3 resolution but your clips are recorded in 16:9.
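To illustrate how second-before, second-after, and second-between-kills interact, here is a hypothetical sketch (not Crispy's actual implementation; the function name and defaults are made up) that turns kill timestamps into merged clip intervals:

```python
def build_clips(kill_times, before=3.0, after=3.0, between=5.0):
    """Turn highlight timestamps (seconds) into clip intervals.

    When two consecutive highlights are closer than `between` seconds,
    they are merged into a single clip, mirroring second-between-kills.
    """
    clips, prev = [], None
    for t in sorted(kill_times):
        if prev is not None and t - prev < between:
            clips[-1][1] = t + after  # merge: extend the previous clip
        else:
            clips.append([max(0.0, t - before), t + after])
        prev = t
    return [tuple(c) for c in clips]

# Kills at 10s and 13s are merged; the kill at 40s stays separate.
print(build_clips([10, 13, 40]))  # [(7.0, 16.0), (37.0, 43.0)]
```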
I recommend using trial and error to find the best settings for your videos.
Here are some settings that have worked well for me:
```json
{
  "neural-network": {
    "confidence": 0.8
  },
  "clip": {
    "framerate": 8,
    "second-before": 4,
    "second-after": 0.5,
    "second-between-kills": 3
  },
  "stretch": false,
  "game": "valorant"
}
```

```json
{
  "neural-network": {
    "confidence": 0.6
  },
  "clip": {
    "framerate": 8,
    "second-before": 4,
    "second-after": 3,
    "second-between-kills": 5
  },
  "stretch": false,
  "game": "overwatch"
}
```

```json
{
  "neural-network": {
    "confidence": 0.7
  },
  "clip": {
    "framerate": 8,
    "second-before": 4,
    "second-after": 1,
    "second-between-kills": 3
  },
  "stretch": false,
  "game": "csgo2"
}
```
Since The Finals does not use the neural network, its settings are a bit different.
The main problem is that the OCR makes processing very slow.
So I recommend a framerate of 4, which gave me the best ratio between speed and results.
If you want better results, you can try increasing the framerate; I would recommend a maximum of 8.
```json
{
  "clip": {
    "framerate": 4,
    "second-before": 6,
    "second-after": 0,
    "second-between-kills": 6
  },
  "stretch": false,
  "game": "thefinals"
}
```
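The impact of the framerate on OCR cost is easy to estimate, since the number of frames sent to the OCR grows linearly with it. A rough back-of-the-envelope sketch (illustrative only):

```python
def frames_to_ocr(video_seconds, framerate):
    # Each second of video contributes `framerate` frames to the OCR pass.
    return int(video_seconds * framerate)

# A 10-minute video at the recommended framerate of 4 vs the maximum of 8:
print(frames_to_ocr(600, 4))  # 2400 frames
print(frames_to_ocr(600, 8))  # 4800 frames, roughly double the processing time
```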
You can now run the application with the run.[sh|bat] file.
The frontend is a web application that lets you add options to the Crispy algorithm.
It has 5 views:
- Clips
- Segments
- Musics
- Effects
- Result
In the clips view, you can see the list of your videos.
You can rearrange them by dragging and dropping them.
Select the videos you want to make segments of by selecting "show" for each video.
Select the videos you want in the montage, and add custom effects to individual clips.
Once you've made your selection, you can click on "generate segments" to create the segments.
In the segments view, you can see the list of your segments.
Each segment is a gameplay highlight chosen by the algorithm.
You can select "hide" on a segment to exclude that segment from the final result.
In the musics view, you can see the list of your music.
This is the music that will play in the final result video.
You can select "hide" for songs you don't want, and you can rearrange them by dragging and dropping them.
In the effects view, you can see the list of your effects.
Those effects are applied to the whole video.
However, per-clip effects override the global effects.
The following effects are available to use:
- blur
- hflip
- vflip
- brightness
- saturation
- zoom
- grayscale
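The override rule (per-clip effects take precedence over global effects of the same name) can be sketched as a simple dictionary merge. This is an illustration, not Crispy's actual code; the effect names come from the list above, and the numeric values are made up:

```python
def resolve_effects(global_effects, clip_effects):
    """Per-clip effects override global ones of the same name."""
    merged = dict(global_effects)
    merged.update(clip_effects)
    return merged

global_effects = {"saturation": 1.2, "grayscale": False}
clip_effects = {"saturation": 0.8, "zoom": 1.1}  # this clip overrides saturation

print(resolve_effects(global_effects, clip_effects))
# {'saturation': 0.8, 'grayscale': False, 'zoom': 1.1}
```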
In the result view, you can see the result of your montage.
Q: How does Crispy detect highlights?

A: To detect highlights in a video game, the neural network searches for visual cues that always appear in a highlight.
For example, in Overwatch, a kill is symbolized by a red skull, so the neural network searches for red skulls in the frames.
Unfortunately, not all games have such a cue.
The Finals, for example, has no symbol that represents a kill.
For those games, the neural network is not used; instead, an OCR detects the killfeed.
The OCR is not as accurate as the neural network, it is slow, and its results depend on the quality of the video.
But it's the best we can do for now.
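As a toy illustration of the symbol-based approach (not the actual neural network): treat a frame as a grid of RGB pixels and flag it when enough strongly red pixels appear, roughly the way a red-skull cue might stand out. The function name and thresholds here are invented for the example:

```python
def looks_like_kill(frame, red_fraction=0.02):
    """frame: list of rows of (r, g, b) tuples, values 0-255.

    Returns True when the fraction of strongly red pixels exceeds
    `red_fraction` -- a crude stand-in for detecting a red skull.
    """
    pixels = [p for row in frame for p in row]
    red = sum(1 for r, g, b in pixels if r > 180 and g < 80 and b < 80)
    return red / len(pixels) > red_fraction

# A mostly grey 2x2 frame with one bright red pixel (25% red).
frame = [[(200, 0, 0), (90, 90, 90)],
         [(90, 90, 90), (90, 90, 90)]]
print(looks_like_kill(frame))  # True
```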
Q: Why is my game not supported?

A: The neural network has simply not been trained for those games.
If you want to add support for a game, you can train the neural network yourself and then make a pull request.
A tutorial is available here.
Q: I play with a custom UI. Can Crispy still detect my kills?

A: Unfortunately, there is nothing you can do.
The neural network is trained to detect kills in the default UI.
I'm planning to add support for custom UIs in the future, but this is definitely not a priority.
Q: Why is the algorithm so slow on The Finals?

A: The algorithm is slow because an OCR is used to detect the killfeed.
This makes processing very slow, which is why I recommend a lower framerate for The Finals.
In most scenarios, a framerate of 4 is enough to detect all the kills; increasing it only increases the processing time without improving the results.
Every contribution is welcome.
First, install pre-commit by running:

```shell
pip install pre-commit
```

Then install the git hooks:

```shell
pre-commit install -t pre-commit -t commit-msg
```

Now pre-commit will run on every git commit.
Run the frontend:

```shell
cd crispy-frontend && yarn && yarn dev
```

Run the backend:

```shell
cd crispy-backend && pip install -Ir requirements-dev.txt && python -m api
```

Run the tests:

```shell
cd crispy-api && pytest
```