Allows Telegram users to submit flood reports via a text-message chatbot. Part of the CogniCity platform, deployed for Urban Risk Map.
This module deploys two AWS Lambda functions:
- A webhook for incoming messages from Telegram
- A reply function to send confirmation messages
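A minimal sketch of what the webhook function could look like is shown below; the handler name, the reply logic, and the response shape are illustrative assumptions, not this module's actual code.

```javascript
// Hypothetical sketch of the Telegram webhook Lambda handler (not the module's actual code).
// Telegram POSTs an Update object; with API Gateway proxy integration it arrives as event.body.
export const handler = (event, context, callback) => {
  const update = JSON.parse(event.body);

  // Ignore updates that are not plain text messages
  if (!update.message || !update.message.text) {
    return callback(null, { statusCode: 200, body: 'ignored' });
  }

  const chatId = update.message.chat.id;
  const text = update.message.text;

  // ...here the real function would fetch a report card link from the CogniCity
  // cards API and invoke the reply function to send a confirmation message...
  console.log(`Received "${text}" from chat ${chatId}`);

  // Always return 200 so Telegram does not keep retrying the update
  callback(null, { statusCode: 200, body: 'ok' });
};
```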
```
npm install
```
- CogniCity:
- This function is designed to work with CogniCity Server v3.0.6 or later, running CogniCity Schema v3.0.7 or later.
- Node:
- v8.10 or later (ES6 syntax is transpiled using Babel)
- Create a Telegram bot as explained here and get a `BOT_TOKEN` here. Configure its About, Description, Commands, and Profile picture, but hold off on setting up the webhook in Telegram until the webhook function is deployed and you know its API Gateway address.
- Create two AWS API Gateway endpoints, one for the webhook function and one for the send function.
- Add the appropriate parameters in `src/config` and `.env`.
- Use the `commands/setWebook` function to tell Telegram the address of the API Gateway (see the sketch after this list for the equivalent raw API call).
- Deploy the functions either by manual upload or by editing the `.travis.yml` file to deploy using Travis.
- Send a text to your Telegram bot to test if it is up and running!
- Read the Misc Notes section for help with configuration.
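If you want to verify what `commands/setWebook` does, or register the webhook by hand, the equivalent raw call to Telegram's `setWebhook` Bot API method looks roughly like this; a standalone sketch using only Node built-ins, which assumes `BOT_TOKEN` and `API_GW_WEBHOOK` are already set in the environment:

```javascript
// Hypothetical one-off script to point Telegram at the API Gateway webhook address.
import https from 'https';

const url = `https://api.telegram.org/bot${process.env.BOT_TOKEN}/setWebhook` +
  `?url=${encodeURIComponent(process.env.API_GW_WEBHOOK)}`;

https.get(url, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  // Telegram replies with {"ok":true,...} when the webhook is accepted
  res.on('end', () => console.log(res.statusCode, body));
}).on('error', (err) => console.error(err));
```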
```
npm run test
```
Save a copy of `sample.env` as `.env` in the local directory with the appropriate credentials:
- `API_GW_WEBHOOK`: The API Gateway address for Telegram to trigger the webhook function
- `BOT_TOKEN`: Access token created when creating the Telegram bot
- `CARDS_API`: CogniCity server endpoint to get unique report card links
- `CARDS_DECK`: Array of [flood, prep] for which decks should be deployed
- `CARDS_API_KEY`: CogniCity server API key
- `CARDS_URL`: Client address for cards
- `PREP_URL`: Client address for prep cards
- `DEFAULT_INSTANCE_COUNTRY_CODE`: Default country for message files (e.g. 'us')
- `DEFAULT_LANGUAGE`: Current default language is English. You can add more languages here and parameterize replies for each language
- `MAP_SERVER`: Client address for the map
- `TELEGRAM_ENDPOINT`: Telegram's API endpoint
- Depending on your deployment method, you may need to add the above parameters to the Lambda functions in the AWS web console
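As a rough illustration of how these variables might be surfaced to the functions through `src/config`, a pattern along these lines is common in Node projects; the property names and the use of `dotenv` here are assumptions, so check `src/config` in this repo for the real structure:

```javascript
// Hypothetical src/config sketch: gather environment variables in one place.
import dotenv from 'dotenv';

// Loads values from a local .env file when running outside AWS
dotenv.config();

export default {
  API_GW_WEBHOOK: process.env.API_GW_WEBHOOK,
  BOT_TOKEN: process.env.BOT_TOKEN,
  CARDS_API: process.env.CARDS_API,
  CARDS_API_KEY: process.env.CARDS_API_KEY,
  CARDS_URL: process.env.CARDS_URL,
  PREP_URL: process.env.PREP_URL,
  DEFAULT_INSTANCE_COUNTRY_CODE: process.env.DEFAULT_INSTANCE_COUNTRY_CODE,
  DEFAULT_LANGUAGE: process.env.DEFAULT_LANGUAGE || 'en',
  MAP_SERVER: process.env.MAP_SERVER,
  TELEGRAM_ENDPOINT: process.env.TELEGRAM_ENDPOINT,
};
```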