Merge pull request #281 from City-of-Turku/develop
Production update
juuso-j committed Jun 15, 2023
2 parents 99840e2 + 77d6de9 commit 5eae113
Showing 92 changed files with 1,385 additions and 818 deletions.
5 changes: 1 addition & 4 deletions .github/workflows/run-tests.yml
@@ -16,10 +16,7 @@ jobs:
DATABASE_URL: postgis://postgres:postgres@localhost/smbackend
ADDITIONAL_INSTALLED_APPS: smbackend_turku,ptv
PTV_ID_OFFSET: 10000000
BIKE_SERVICE_STATIONS_IDS: service_node=500000,service=500000,units_offset=500000
GAS_FILLING_STATIONS_IDS: service_node=200000,service=200000,units_offset=200000
CHARGING_STATIONS_IDS: service_node=300000,service=300000,units_offset=300000
BICYCLE_STANDS_IDS: service_node=400000,service=400000,units_offset=400000
LAM_COUNTER_API_BASE_URL: https://tie.digitraffic.fi/api/tms/v1/history

steps:
- uses: actions/checkout@v2
6 changes: 5 additions & 1 deletion config_dev.env.example
@@ -180,4 +180,8 @@ YIT_ROUTES_URL=https://api.autori.io/api/dailymaintenance-a3/route/
YIT_VEHICLES_URL=https://api.autori.io/api/dailymaintenance-a3/route/types/vehicle/
YIT_CONTRACTS_URL=https://api.autori.io/api/dailymaintenance-a3/contracts/
YIT_TOKEN_URL=https://login.microsoftonline.com/86792d09-0d81-4899-8d66-95dfc96c8014/oauth2/v2.0/token?Scope=api://7f45c30e-cc67-4a93-85f1-0149b44c1cdf/.default
KUNTEC_KEY=
# API key to the Kuntec API
KUNTEC_KEY=
# Telraam API token, required when fetching Telraam data to csv (import_telraam_to_csv.py)
# https://telraam.helpspace-docs.io/article/27/you-wish-more-data-and-statistics-telraam-api
TELRAAM_TOKEN=
17 changes: 12 additions & 5 deletions eco_counter/README.md
@@ -2,8 +2,8 @@

Imports/Processes data from:
https://data.turku.fi/2yxpk2imqi2mzxpa6e6knq
Imports both "Liikennelasketa-Ilmaisintiedot 15 min aikaväleillä"(Traffic Counter) and "Eco-Counter" (Eco Counter) datas. Imports/processes "LAM-Counter" (LAM Counter) data from https://www.digitraffic.fi/tieliikenne/lam/

Imports both "Liikennelasketa-Ilmaisintiedot 15 min aikaväleillä" (Traffic Counter) and "Eco-Counter" (Eco Counter) datasets. Imports/processes "LAM-Counter" (LAM Counter) data from https://www.digitraffic.fi/tieliikenne/lam/ and
Telraam data from https://telraam-api.net/.

## Installation:
Add the following lines to the .env:
@@ -16,21 +16,28 @@ Note, The urls can change. Up-to-date urls can be found at:
https://www.avoindata.fi/data/fi/dataset/turun-seudun-liikennemaaria
and
https://www.digitraffic.fi/tieliikenne/lam/
Telraam API token, required when fetching Telraam data to CSV (import_telraam_to_csv.py): https://telraam.helpspace-docs.io/article/27/you-wish-more-data-and-statistics-telraam-api
TELRAAM_TOKEN=

## Importing

### Initial Import
The initial import must be done before starting the continuous incremental imports:
./manage.py import_counter_data --init COUNTERS
e.g. ./manage.py import_counter_data --init EC TC
The counters are EC(Eco Counter), TC(Traffic Counter) and LC(Lam Counter).
The counters are EC (Eco Counter), TC (Traffic Counter), LC (Lam Counter) and TR (Telraam Counter).

### Continuous Import
For continuous (hourly) imports run:
./manage.py import_counter_data --counters COUNTERS
e.g. ./manage.py import_counter_data --counters EC TC
Counter names are: EC (Eco Counter), TC (Traffic Counter) and LC (Lam Counter).
Note, Traffic Counter data is updated once a week.
Counter names are: EC (Eco Counter), TC (Traffic Counter), LC (Lam Counter) and TR (Telraam Counter).
Note: Traffic Counter data is updated once a week and Lam Counter data once a day.

### Importing Telraam raw data
In order to import Telraam data into the database, the raw data has to be imported first. The raw data is imported with the _import_telraam_to_csv_ management command.
The importer should be scheduled to run once an hour (see: https://github.com/City-of-Turku/smbackend/wiki/Celery-Tasks#telraam-to-csv-eco_countertasksimport_telraam_to_csv ).
Telraam raw data is imported to PROJECT_ROOT/media/telraam_data/.

## Troubleshooting
For reasons unknown, the number of sensors can sometimes change in the source CSV file, e.g. the number of columns changes. If this happens, run the initial import (./manage.py import_counter_data --init) and after that it is safe to run the importer as normal.
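The README commands above can be combined into a short, hypothetical session (assumes a configured smbackend environment; counter codes as listed in the README):

```shell
# Hypothetical workflow sketch; run from the project root of a
# configured smbackend installation.
./manage.py import_counter_data --init EC TC LC TR      # one-time initial import
./manage.py import_counter_data --counters EC TC LC TR  # continuous (hourly) import
```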
21 changes: 8 additions & 13 deletions eco_counter/api/serializers.py
@@ -35,15 +35,15 @@ class StationSerializer(serializers.ModelSerializer):

class Meta:
model = Station

fields = [
"id",
"name",
"name_fi",
"name_sv",
"name_en",
"csv_data_source",
"geom",
"location",
"geometry",
"x",
"y",
"lon",
@@ -52,18 +52,18 @@ class Meta:
]

def get_y(self, obj):
return obj.geom.y
return obj.location.y

def get_lat(self, obj):
obj.geom.transform(4326)
return obj.geom.y
obj.location.transform(4326)
return obj.location.y

def get_x(self, obj):
return obj.geom.x
return obj.location.x

def get_lon(self, obj):
obj.geom.transform(4326)
return obj.geom.x
obj.location.transform(4326)
return obj.location.x

def get_sensor_types(self, obj):
# Return the sensor types (car, bike, etc.) that have a total year value > 0.
@@ -100,7 +100,6 @@ class Meta:


class DaySerializer(serializers.ModelSerializer):

station_name = serializers.PrimaryKeyRelatedField(
many=False, source="station.name", read_only=True
)
@@ -207,7 +206,6 @@ class Meta:


class HourDataSerializer(serializers.ModelSerializer):

day_info = DayInfoSerializer(source="day")

class Meta:
@@ -229,7 +227,6 @@ class Meta:


class DayDataSerializer(serializers.ModelSerializer):

day_info = DayInfoSerializer(source="day")

class Meta:
@@ -254,7 +251,6 @@ class Meta:


class MonthDataSerializer(serializers.ModelSerializer):

month_info = MonthInfoSerializer(source="month")

class Meta:
@@ -267,7 +263,6 @@ class Meta:


class YearDataSerializer(serializers.ModelSerializer):

year_info = YearInfoSerializer(source="year")

class Meta:
2 changes: 0 additions & 2 deletions eco_counter/api/views.py
@@ -69,7 +69,6 @@ def list(self, request):


class HourDataViewSet(viewsets.ReadOnlyModelViewSet):

queryset = HourData.objects.all()
serializer_class = HourDataSerializer

@@ -80,7 +79,6 @@ def get_hour_data(self, request):


class DayDataViewSet(viewsets.ReadOnlyModelViewSet):

queryset = DayData.objects.all()
serializer_class = DayDataSerializer

136 changes: 136 additions & 0 deletions eco_counter/constants.py
@@ -0,0 +1,136 @@
import platform
import types

import requests
from django.conf import settings
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry

INDEX_COLUMN_NAME = "startTime"

TRAFFIC_COUNTER_START_YEAR = 2015
# Manually define the end year, as the source data comes from the page
# defined in env variable TRAFFIC_COUNTER_OBSERVATIONS_BASE_URL.
# Change end year when data for the next year is available.
TRAFFIC_COUNTER_END_YEAR = 2023
ECO_COUNTER_START_YEAR = 2020
LAM_COUNTER_START_YEAR = 2010
TELRAAM_COUNTER_START_YEAR = 2023


TRAFFIC_COUNTER = "TC"
ECO_COUNTER = "EC"
LAM_COUNTER = "LC"
TELRAAM_COUNTER = "TR"
TELRAAM_CSV = "TV"

COUNTERS = types.SimpleNamespace()
COUNTERS.TRAFFIC_COUNTER = TRAFFIC_COUNTER
COUNTERS.ECO_COUNTER = ECO_COUNTER
COUNTERS.LAM_COUNTER = LAM_COUNTER
COUNTERS.TELRAAM_COUNTER = TELRAAM_COUNTER

CSV_DATA_SOURCES = (
(TRAFFIC_COUNTER, "TrafficCounter"),
(ECO_COUNTER, "EcoCounter"),
(LAM_COUNTER, "LamCounter"),
(TELRAAM_COUNTER, "TelraamCounter"),
(TELRAAM_CSV, "TelraamCSV"),
)
COUNTER_START_YEARS = {
ECO_COUNTER: ECO_COUNTER_START_YEAR,
TRAFFIC_COUNTER: TRAFFIC_COUNTER_START_YEAR,
LAM_COUNTER: LAM_COUNTER_START_YEAR,
TELRAAM_COUNTER: TELRAAM_COUNTER_START_YEAR,
}

TRAFFIC_COUNTER_METADATA_GEOJSON = "traffic_counter_metadata.geojson"
LAM_STATIONS_API_FETCH_URL = (
settings.LAM_COUNTER_API_BASE_URL
+ "?api=liikennemaara&tyyppi=h&pvm={start_date}&loppu={end_date}"
+ "&lam_type=option1&piste={id}&luokka=kaikki&suunta={direction}&sisallytakaistat=0"
)
# LAM stations in the locations list are included.
LAM_STATION_LOCATIONS = ["Turku", "Raisio", "Kaarina", "Lieto", "Hauninen", "Oriketo"]
# Header that is added to the request that fetches the LAM data.
LAM_STATION_USER_HEADER = {
"Digitraffic-User": f"{platform.uname()[1]}/Turun Palvelukartta"
}
# Mappings are derived from the 'suunta' and 'suuntaselite' columns in the source data.
# (P) = poispäin (away from the centre), (K) = keskustaan päin (towards the centre)
LAM_STATIONS_DIRECTION_MAPPINGS = {
# vt8_Raisio
"1_Vaasa": "P",
"2_Turku": "K",
# vt1_Kaarina_Kirismäki
"1_Turku": "K",
"2_Helsinki": "P",
# vt10_Lieto
"1_Hämeenlinna": "P",
# "2_Turku": "K", Duplicate
# vt1_Turku_Kupittaa
# "1_Turku" Duplicate
# "2_Helsinki" Duplicate
# vt1_Turku_Kurkela_länsi
# "1_Turku" Duplicate
# "2_Helsinki" Duplicate
# vt1_Kaarina_Kurkela_itä
# "1_Turku" Duplicate
# "2_Helsinki" Duplicate
# vt1_Kaarina
# "1_Turku" Duplicate
# "2_Helsinki" Duplicate
# vt1_Kaarina_Piikkiö
# "1_Turku" Duplicate
# "2_Helsinki" Duplicate
# yt1851_Turku_Härkämäki
"1_Suikkila": "K",
"2_Artukainen": "P",
# kt40_Hauninen
"1_Piikkiö": "K",
"2_Naantali": "P",
# kt40_Oriketo
# "1_Piikkiö": "K", duplicate
# "2_Naantali": "P", duplicate
}
keys = [k for k in range(TRAFFIC_COUNTER_START_YEAR, TRAFFIC_COUNTER_END_YEAR + 1)]
# Create a dict where the years to be imported are keys and the value is the URL of the CSV data.
# e.g. {2015: "https://data.turku.fi/2yxpk2imqi2mzxpa6e6knq/2015_laskenta_juha.csv"}
TRAFFIC_COUNTER_CSV_URLS = dict(
[
(k, f"{settings.TRAFFIC_COUNTER_OBSERVATIONS_BASE_URL}{k}_laskenta_juha.csv")
for k in keys
]
)
TELRAAM_COUNTER_API_BASE_URL = "https://telraam-api.net"
# Maximum 3 months at a time
TELRAAM_COUNTER_TRAFFIC_URL = f"{TELRAAM_COUNTER_API_BASE_URL}/v1/reports/traffic"
TELRAAM_COUNTER_CAMERAS_URL = TELRAAM_COUNTER_API_BASE_URL + "/v1/cameras/{mac_id}"

TELRAAM_COUNTER_CAMERA_SEGMENTS_URL = (
TELRAAM_COUNTER_API_BASE_URL + "/v1/segments/id/{id}"
)
# The start month of the start year, as Telraam data is not available
# from the beginning of the start year.
TELRAAM_COUNTER_START_MONTH = 5
TELRAAM_COUNTER_API_TIME_FORMAT = "%Y-%m-%d %H:%M:%S"
TELRAAM_COUNTER_CSV_FILE_PATH = f"{settings.MEDIA_ROOT}/telraam_data/"
TELRAAM_COUNTER_CSV_FILE = (
TELRAAM_COUNTER_CSV_FILE_PATH + "telraam_data_{id}_{day}_{month}_{year}.csv"
)
TELRAAM_COUNTER_CAMERAS = {
# Mac id: Direction flag (True = the rgt prefix means towards the centre, "keskustaan päin")
350457790598039: False, # Kristiinanankatu, to the left when facing the river
350457790600975: True, # Kristiinanankatu, to the right when facing the river
}
# For 429 (Too Many Requests) responses the Telraam API needs a retry strategy
retry_strategy = Retry(
total=10,
status_forcelist=[429],
method_whitelist=["GET", "POST"],
backoff_factor=30, # 30, 60, 120, 240, ... seconds
)
adapter = HTTPAdapter(max_retries=retry_strategy)
TELRAAM_HTTP = requests.Session()
TELRAAM_HTTP.mount("https://", adapter)
TELRAAM_HTTP.mount("http://", adapter)
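The `backoff_factor=30` in the `Retry` config above aims for roughly doubling sleep times between retries, as its comment notes. A minimal sketch of that schedule (a simplification of urllib3's exact formula, which returns 0 before the first retry in some versions; the helper name is mine):

```python
# Sketch of the exponential backoff schedule that backoff_factor=30 aims
# for in the Retry config above: each retry waits twice as long as the
# previous one. Simplified model, not urllib3's exact implementation.
def backoff_schedule(backoff_factor, retries):
    """Return the sleep time before each retry, doubling on every attempt."""
    return [backoff_factor * (2 ** attempt) for attempt in range(retries)]

print(backoff_schedule(30, 4))  # [30, 60, 120, 240]
```

Note also that newer urllib3 releases rename the `method_whitelist` argument used above to `allowed_methods`; the committed spelling works on the urllib3 1.x line.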
17 changes: 17 additions & 0 deletions eco_counter/management/commands/delete_all_counter_data.py
@@ -0,0 +1,17 @@
import logging

from django import db
from django.core.management.base import BaseCommand

from eco_counter.models import ImportState, Station

logger = logging.getLogger("eco_counter")


class Command(BaseCommand):
@db.transaction.atomic
def handle(self, *args, **options):
logger.info("Deleting all counter data...")
logger.info(f"{Station.objects.all().delete()}")
logger.info(f"{ImportState.objects.all().delete()}")
logger.info("Deleted all counter data.")