diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
new file mode 100644
index 0000000..545b8f8
--- /dev/null
+++ b/.pre-commit-config.yaml
@@ -0,0 +1,38 @@
+# See https://pre-commit.com for more information
+repos:
+- repo: https://github.com/pre-commit/pre-commit-hooks
+ rev: v4.1.0 # Python 3.6 compatible
+ hooks:
+ # Python related checks
+ - id: check-ast
+ - id: check-builtin-literals
+ - id: check-docstring-first
+ - id: name-tests-test
+ name: Check unit tests start with 'test_'
+ args: ['--django']
+ files: 'test/.*'
+ # Other checks
+ - id: check-added-large-files
+ - id: check-case-conflict
+ - id: check-merge-conflict
+ - id: check-yaml
+ - id: debug-statements
+ - id: detect-private-key
+ - id: end-of-file-fixer
+ - id: mixed-line-ending
+ name: Force line endings to LF
+ args: ['--fix=lf']
+ - id: trailing-whitespace
+
+- repo: https://github.com/pre-commit/pygrep-hooks
+ rev: v1.10.0
+ hooks:
+ - id: python-check-mock-methods
+ - id: python-no-eval
+ - id: python-no-log-warn
+ - id: python-use-type-annotations
+
+# Pre-commit CI config, see https://pre-commit.ci/
+ci:
+ autofix_prs: false
+ autoupdate_schedule: quarterly
diff --git a/docs/setting_up_VM_with_app.md b/docs/setting_up_VM_with_app.md
new file mode 100644
index 0000000..254bf0f
--- /dev/null
+++ b/docs/setting_up_VM_with_app.md
@@ -0,0 +1,27 @@
+## Steps to follow to get a VM with monitoring app running on Apache
+
+To get a 'prototype' of the monitoring app running with Apache, follow these steps:
+- create a cloud VM of the type: scientific-linux-7-aq
+- continue by selecting sandbox 'testing_personality_2', archetype 'ral-tier1', personality 'apel-data-validation-test'
+
+Allow 15 minutes after the machine is created, then remember to edit the security groups in OpenStack to allow Apache traffic through.
+Then follow these steps from within the machine:
+- `quattor-fetch && quattor-configure --all`
+- `cd /usr/share/DJANGO_MONITORING_APP/monitoring`
+- modify the file `settings.py`, specifically the dict called `DATABASES`, to include the right credentials and database names
+- `cd ..`
+- `source venv/bin/activate`
+- `systemctl restart httpd`
+- `sudo chown apache .`
+
+At this point the app should be working: get the IP address by running `hostname -I` within the machine, and the app should already be running at that address.
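As a hypothetical illustration of the `settings.py` step above, the `DATABASES` dict typically ends up looking something like the sketch below. All database names, hosts and credentials here are placeholders, not the real values; the `grid` and `cloud` aliases match the `objects.using('grid')` / `objects.using('cloud')` calls elsewhere in the app.

```python
# Hypothetical sketch of the DATABASES dict in settings.py.
# Every name, host and credential below is a placeholder.
DATABASES = {
    # Local SQLite database that the app reads from
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': '/usr/share/DJANGO_MONITORING_APP/db.sqlite3',
    },
    # External grid accounting database (placeholder credentials)
    'grid': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'grid_accounting',
        'HOST': 'db.example.org',
        'USER': 'reader',
        'PASSWORD': 'changeme',
    },
    # External cloud accounting database (placeholder credentials)
    'cloud': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'cloud_accounting',
        'HOST': 'db.example.org',
        'USER': 'reader',
        'PASSWORD': 'changeme',
    },
}
```

The exact engine and connection details depend on the deployment, so treat this only as a template for what needs filling in.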
+
+
+## What to do if the app seems to stop working after closing the VM
+If the VM is shut down, the next time we try to open the app Apache might give the error message "Unable to open the database file".
+If this happens, just follow these steps on the machine:
+1. `cd /usr/share/DJANGO_MONITORING_APP`
+2. `source venv/bin/activate`
+3. `sudo chown apache .`
+
+Note that step 2 is necessary for step 3 to work and the error message to disappear.
diff --git a/docs/what_gets_installed.md b/docs/what_gets_installed.md
new file mode 100644
index 0000000..38d7d38
--- /dev/null
+++ b/docs/what_gets_installed.md
@@ -0,0 +1,23 @@
+For Django to work with Apache, it is common to have a venv within the app, where `django` and `djangorestframework` are installed. Other packages needed for the app to work are installed by Aquilon outside the venv.
+
+## Packages installed by Aquilon outside the venv
+According to the config file that Aquilon uses, the following packages are installed:
+- `httpd`
+- `python3-mod_wsgi` (for Apache to work with Django)
+- `python3-devel`
+- `gcc` (needed for dependencies)
+- `mariadb`
+- `tar`
+
+## Packages installed within the venv
+Within the venv, the following are installed through pip:
+- `djangorestframework` (3.15.1)
+- `pymysql` (1.0.2) (needed for mariadb to work)
+- `pandas` (1.1.5) (needed by the app)
+- `django` (3.2.25)
+- `pytz` (2025.2)
+
+Note that where a package version is specified, the app will not work with a different version (due to dependency conflicts).
+
+It is also important to note that different types of OS require different packages to be installed.
+The above are the packages that allow the app to work on Rocky 8.
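The pinned versions above can be captured in a `requirements.txt` for the venv (the file name and layout are an assumption; the versions are the ones listed above):

```text
djangorestframework==3.15.1
pymysql==1.0.2
pandas==1.1.5
django==3.2.25
pytz==2025.2
```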
diff --git a/docs/what_pages_can_be_accessed.md b/docs/what_pages_can_be_accessed.md
new file mode 100644
index 0000000..6f83fb7
--- /dev/null
+++ b/docs/what_pages_can_be_accessed.md
@@ -0,0 +1,22 @@
+## Pages that can be accessed through the app
+
+The following urls are the ones that can be accessed without passing any parameter:
+- http://ip-address/publishing/cloud/
+- http://ip-address/publishing/gridsync/
+
+These pages show info for a number of sites, so they do not require a site name to be specified within the url.
+
+The url http://ip-address/publishing/grid/, by contrast, should be used together with the name of the site we are looking for.
+For example: http://ip-address/publishing/grid/BelGrid-UCL/
+It is not supposed to be used without the name of a site.
+
+The url http://ip-address/publishing/gridsync/ shows a sync table, and it is probably the most important part of the personality 'apel-data-validation'.
+This table contains data related to many sites, specifically the number of jobs published versus the number in the database, shown for every available month of each site.
+
+Clicking on any name in the first column (containing site names) opens a similar table which only shows the data for that site.
+In this more specific table, the first column shows the months for which we have data (for that site).
+Clicking on a month opens another table that shows data for that month and site only, broken down by submithost.
+
+The pages reached through these links can of course also be accessed by typing the url directly. For example, if we want data related to the site 'CSCS-LCG2' and month '2013-11', we would type:
+http://ip-address/publishing/gridsync/CSCS-LCG2/2013-11/
+However, in this case, if there is no data for the month we are looking for, we would get an error.
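The url hierarchy described above can be sketched as a small helper; the IP address used here is a placeholder, and the function itself is only an illustration, not part of the app:

```python
# Sketch of the gridsync url hierarchy described above.
# "203.0.113.10" is a placeholder IP address, not a real deployment.
BASE = "http://{ip}/publishing"

def gridsync_url(ip, site=None, month=None):
    """Build a gridsync url, optionally narrowed down to a site and a month."""
    url = BASE.format(ip=ip) + "/gridsync/"
    if site is not None:
        url += site + "/"
        if month is not None:
            url += month + "/"
    return url

print(gridsync_url("203.0.113.10"))
# -> http://203.0.113.10/publishing/gridsync/
print(gridsync_url("203.0.113.10", "CSCS-LCG2", "2013-11"))
# -> http://203.0.113.10/publishing/gridsync/CSCS-LCG2/2013-11/
```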
diff --git a/monitoring/__init__.py b/monitoring/__init__.py
index e69de29..063cd2c 100644
--- a/monitoring/__init__.py
+++ b/monitoring/__init__.py
@@ -0,0 +1,2 @@
+import pymysql
+pymysql.install_as_MySQLdb()
diff --git a/monitoring/availability/apps.py b/monitoring/availability/apps.py
index c7eacd8..53f90fa 100644
--- a/monitoring/availability/apps.py
+++ b/monitoring/availability/apps.py
@@ -5,4 +5,4 @@
class AvailabilityConfig(AppConfig):
- name = 'availability'
+ name = 'monitoring.availability'
diff --git a/monitoring/availability/templates/status.html b/monitoring/availability/templates/status.html
new file mode 100644
index 0000000..9b504ca
--- /dev/null
+++ b/monitoring/availability/templates/status.html
@@ -0,0 +1 @@
+{{ message }}
diff --git a/monitoring/availability/urls.py b/monitoring/availability/urls.py
index 0c9a403..077876e 100644
--- a/monitoring/availability/urls.py
+++ b/monitoring/availability/urls.py
@@ -1,7 +1,7 @@
-from django.conf.urls import url
+from django.urls import path
-import views
+from monitoring.availability import views
urlpatterns = [
- url(r'^$', views.status),
+ path('', views.status, name='availability'),
]
diff --git a/monitoring/availability/views.py b/monitoring/availability/views.py
index 380575e..eff3094 100644
--- a/monitoring/availability/views.py
+++ b/monitoring/availability/views.py
@@ -1,15 +1,10 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
-import time
-
from rest_framework.decorators import api_view
from rest_framework.response import Response
@api_view()
def status(requst):
- if int(time.time()) % 2:
- return Response("Everything OK")
- else:
- return Response("Everything NOT ok.")
+ return Response({"message": "OK"}, status=200, template_name="status.html")
diff --git a/monitoring/benchmarks/__init__.py b/monitoring/benchmarks/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/monitoring/benchmarks/admin.py b/monitoring/benchmarks/admin.py
new file mode 100644
index 0000000..8c38f3f
--- /dev/null
+++ b/monitoring/benchmarks/admin.py
@@ -0,0 +1,3 @@
+from django.contrib import admin
+
+# Register your models here.
diff --git a/monitoring/benchmarks/apps.py b/monitoring/benchmarks/apps.py
new file mode 100644
index 0000000..bb0e0b3
--- /dev/null
+++ b/monitoring/benchmarks/apps.py
@@ -0,0 +1,5 @@
+from django.apps import AppConfig
+
+
+class BenchmarksConfig(AppConfig):
+ name = 'monitoring.benchmarks'
diff --git a/monitoring/benchmarks/migrations/__init__.py b/monitoring/benchmarks/migrations/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/monitoring/benchmarks/models.py b/monitoring/benchmarks/models.py
new file mode 100644
index 0000000..72f9f64
--- /dev/null
+++ b/monitoring/benchmarks/models.py
@@ -0,0 +1,54 @@
+from django.db import models
+
+
+class BenchmarksBySubmithost(models.Model):
+ fetched = models.DateTimeField(auto_now=True)
+ SiteName = models.CharField(max_length=255)
+ SubmitHost = models.CharField(max_length=255)
+ BenchmarkType = models.CharField(max_length=50)
+ BenchmarkValue = models.DecimalField(max_digits=10, decimal_places=3)
+ RecordType = models.CharField(max_length=50)
+ UpdateTime = models.DateTimeField()
+
+ class Meta:
+ ordering = ('SiteName',)
+
+
+class VJobRecords(models.Model):
+ Site = models.CharField(max_length=255, primary_key=True)
+ SubmitHost = models.CharField(max_length=255)
+ ServiceLevelType = models.CharField(max_length=50)
+ ServiceLevel = models.DecimalField(max_digits=10, decimal_places=3)
+ UpdateTime = models.DateTimeField()
+ EndTime = models.DateTimeField()
+
+ class Meta:
+ managed = False
+ db_table = 'VJobRecords'
+ verbose_name = 'Job Record'
+
+
+class VSummaries(models.Model):
+ Site = models.CharField(max_length=255, primary_key=True)
+ SubmitHost = models.CharField(max_length=255)
+ ServiceLevelType = models.CharField(max_length=50)
+ ServiceLevel = models.DecimalField(max_digits=10, decimal_places=3)
+ UpdateTime = models.DateTimeField()
+
+ class Meta:
+ managed = False
+ db_table = 'VSummaries'
+ verbose_name = 'Summary'
+
+
+class VNormalisedSummaries(models.Model):
+ Site = models.CharField(max_length=255, primary_key=True)
+ SubmitHost = models.CharField(max_length=255)
+ ServiceLevelType = models.CharField(max_length=50)
+ ServiceLevel = models.DecimalField(max_digits=10, decimal_places=3)
+ UpdateTime = models.DateTimeField()
+
+ class Meta:
+ managed = False
+ db_table = 'VNormalisedSummaries'
+ verbose_name = 'Normalised Summary'
diff --git a/monitoring/benchmarks/serializers.py b/monitoring/benchmarks/serializers.py
new file mode 100644
index 0000000..e7eb7cc
--- /dev/null
+++ b/monitoring/benchmarks/serializers.py
@@ -0,0 +1,21 @@
+from rest_framework import serializers
+
+from monitoring.benchmarks.models import BenchmarksBySubmithost
+
+
+class BenchmarksBySubmithostSerializer(serializers.HyperlinkedModelSerializer):
+    # Override default format with None so that Python datetime is used as
+    # output format. Encoding will be determined by the renderer and can be
+    # formatted by a template filter.
+ UpdateTime = serializers.DateTimeField(format=None)
+
+ class Meta:
+ model = BenchmarksBySubmithost
+ fields = (
+ 'SiteName',
+ 'SubmitHost',
+ 'BenchmarkType',
+ 'BenchmarkValue',
+ 'RecordType',
+ 'UpdateTime',
+ )
diff --git a/monitoring/benchmarks/templates/benchmarks_by_submithost.html b/monitoring/benchmarks/templates/benchmarks_by_submithost.html
new file mode 100644
index 0000000..f2e3f81
--- /dev/null
+++ b/monitoring/benchmarks/templates/benchmarks_by_submithost.html
@@ -0,0 +1,49 @@
+<!DOCTYPE html>
+<html>
+<head>
+  <title>Sites publishing benchmark records</title>
+</head>
+<body>
+  <h1>Sites publishing benchmark records in last 3 months</h1>
+  <p>Page last updated: {{ last_fetched|date:"Y-m-d H:i:s.u" }}</p>
+  <table>
+    <tr>
+      <th>Record type</th>
+      <th>Number of Sites</th>
+    </tr>
+    {% for item in site_counts_by_record_type %}
+    <tr>
+      <td>{{ item.RecordType }}</td>
+      <td>{{ item.site_count }}</td>
+    </tr>
+    {% endfor %}
+  </table>
+  <br>
+  <table>
+    <tr>
+      <th>Site</th>
+      <th>Submit host</th>
+      <th>Benchmark type</th>
+      <th>Benchmark value</th>
+      <th>Record type</th>
+      <th>Last updated</th>
+    </tr>
+    {% for benchmark in benchmarks %}
+    <tr>
+      <td>{{ benchmark.SiteName }}</td>
+      <td>{{ benchmark.SubmitHost }}</td>
+      <td>{{ benchmark.BenchmarkType }}</td>
+      <td>{{ benchmark.BenchmarkValue }}</td>
+      <td>{{ benchmark.RecordType }}</td>
+      <td>{{ benchmark.UpdateTime|date:"Y-m-d H:i:s" }}</td>
+    </tr>
+    {% endfor %}
+  </table>
+</body>
+</html>
diff --git a/monitoring/benchmarks/tests.py b/monitoring/benchmarks/tests.py
new file mode 100644
index 0000000..7ce503c
--- /dev/null
+++ b/monitoring/benchmarks/tests.py
@@ -0,0 +1,3 @@
+from django.test import TestCase
+
+# Create your tests here.
diff --git a/monitoring/benchmarks/urls.py b/monitoring/benchmarks/urls.py
new file mode 100644
index 0000000..e3b06cb
--- /dev/null
+++ b/monitoring/benchmarks/urls.py
@@ -0,0 +1,8 @@
+from rest_framework import routers
+
+from monitoring.benchmarks import views
+
+router = routers.SimpleRouter()
+router.register('', views.BenchmarksViewSet)
+
+urlpatterns = router.urls
diff --git a/monitoring/benchmarks/views.py b/monitoring/benchmarks/views.py
new file mode 100644
index 0000000..736d6ba
--- /dev/null
+++ b/monitoring/benchmarks/views.py
@@ -0,0 +1,45 @@
+from django.shortcuts import render
+from datetime import datetime, timedelta
+
+from django.db.models import Max, Count
+from django.db.models.functions import Lower
+
+from rest_framework import viewsets
+from rest_framework.renderers import TemplateHTMLRenderer
+
+
+from monitoring.benchmarks.models import BenchmarksBySubmithost
+
+from monitoring.benchmarks.serializers import BenchmarksBySubmithostSerializer
+
+class BenchmarksViewSet(viewsets.ReadOnlyModelViewSet):
+ # Lower('SiteName'): sorts sites alphabetically, case-insensitively.
+ # '-UpdateTime': sorts records within each site by UpdateTime in descending order (latest first).
+ queryset = BenchmarksBySubmithost.objects.all().order_by(Lower('SiteName'), '-UpdateTime')
+
+ serializer_class = BenchmarksBySubmithostSerializer
+ template_name = 'benchmarks_by_submithost.html'
+
+    def list(self, request):
+        last_fetched = BenchmarksBySubmithost.objects.aggregate(Max('fetched'))['fetched__max']
+
+        response = super().list(request)
+
+ # Count number of distinct sites per RecordType
+ site_counts_by_record_type = (
+ BenchmarksBySubmithost.objects
+ .values('RecordType')
+ .annotate(site_count=Count('SiteName', distinct=True))
+ .order_by('RecordType')
+ )
+
+        if isinstance(request.accepted_renderer, TemplateHTMLRenderer):
+ response.data = {
+ 'benchmarks': response.data,
+ 'last_fetched': last_fetched,
+ 'site_counts_by_record_type': site_counts_by_record_type
+ }
+
+ return response
diff --git a/monitoring/db_update_sqlite.py b/monitoring/db_update_sqlite.py
new file mode 100644
index 0000000..2fc2a77
--- /dev/null
+++ b/monitoring/db_update_sqlite.py
@@ -0,0 +1,322 @@
+# -*- coding: utf-8 -*-
+"""
+`db_update_sqlite.py` - Syncs data from external database into local SQLite DB.
+ - It will be run as a standalone operation via cron.
+"""
+import configparser
+import logging
+import os
+import sys
+
+import django
+from django.db import DatabaseError
+import pandas as pd
+from django.utils.timezone import make_aware, is_naive
+
+
+BASE_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
+
+# Find the root and the Django project
+sys.path.append(BASE_DIR)
+
+# Set up Django settings to run this `db_update_sqlite.py` as a standalone script
+os.environ.setdefault("DJANGO_SETTINGS_MODULE", "monitoring.settings")
+
+# Initialize and setup Django
+django.setup()
+
+
+# Cron jobs run in a minimal environment and lack access to Django settings.
+# To ensure proper model imports and database interactions, we MUST initialize and set up Django first.
+from monitoring.publishing.models import (
+ GridSite,
+ CloudSite,
+ GridSiteSync,
+ VAnonCloudRecord,
+ VSuperSummaries,
+ VSyncRecords
+)
+
+from monitoring.publishing.views import (
+ summaries_dict_standard,
+ syncrecords_dict_standard,
+ correct_dict,
+ fill_summaries_dict,
+ fill_syncrecords_dict,
+ get_year_month_str
+)
+
+from monitoring.benchmarks.models import (
+ BenchmarksBySubmithost,
+ VJobRecords,
+ VSummaries,
+ VNormalisedSummaries,
+)
+
+try:
+    # Read configuration from the file
+    cp = configparser.ConfigParser(interpolation=None)
+    file_path = os.path.join(BASE_DIR, 'monitoring', 'settings.ini')
+    cp.read(file_path)
+
+except configparser.Error as err:
+    print("Error in configuration file. Check that the file exists: %s" % err)
+    sys.exit(1)
+
+# Set up basic logging config
+logging.basicConfig(
+ filename=cp.get('common', 'logfile'),
+ level=logging.INFO,
+ format='%(asctime)s - %(levelname)s - %(message)s'
+)
+
+# set up the logger
+log = logging.getLogger(__name__)
+
+
+def determine_sync_status(f):
+    """
+    Helper to determine sync status between published and database record counts.
+    """
+    RecordCountPublished = f.get("RecordCountPublished")
+    RecordCountInDb = f.get("RecordCountInDb")
+    # Guard against division by zero when either count is zero or missing
+    if not RecordCountPublished or not RecordCountInDb:
+        return "ERROR [ Please use the Gap Publisher to synchronise this dataset]"
+    rel_diff1 = abs(RecordCountPublished - RecordCountInDb) / RecordCountInDb
+    rel_diff2 = abs(RecordCountPublished - RecordCountInDb) / RecordCountPublished
+    if rel_diff1 < 0.01 or rel_diff2 < 0.01:
+        syncstatus = "OK"
+    else:
+        syncstatus = "ERROR [ Please use the Gap Publisher to synchronise this dataset]"
+    return syncstatus
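To make the 1% tolerance concrete, here is a standalone restatement of the comparison logic with a couple of worked inputs. The function is copied out of context so the snippet runs on its own, and the ERROR status is abbreviated here (the real helper returns a longer message):

```python
# Standalone restatement of the sync-status comparison used above:
# counts within 1% of each other (relative to either count) are in sync.
def sync_status(published, in_db):
    rel_diff1 = abs(published - in_db) / in_db
    rel_diff2 = abs(published - in_db) / published
    return "OK" if rel_diff1 < 0.01 or rel_diff2 < 0.01 else "ERROR"

print(sync_status(995, 1000))   # 5/1000 = 0.5% difference -> OK
print(sync_status(900, 1000))   # 100/1000 = 10% difference -> ERROR
```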
+
+
+def refresh_gridsite():
+ try:
+ sql_query = """
+ SELECT
+ Site,
+ max(LatestEndTime) AS LatestPublish
+ FROM VSuperSummaries
+ WHERE LatestEndTime > DATE_SUB(NOW(), INTERVAL 1 YEAR)
+ GROUP BY 1;
+ """
+ fetchset = VSuperSummaries.objects.using('grid').raw(sql_query)
+
+ for f in fetchset:
+ GridSite.objects.update_or_create(
+ defaults={'updated': f.LatestPublish},
+ SiteName=f.Site
+ )
+
+ log.info("Refreshed GridSite")
+
+ except DatabaseError:
+ log.exception('Error while trying to refresh GridSite')
+
+
+def refresh_cloudsite():
+ try:
+ sql_query = """
+ SELECT
+ b.SiteName,
+ COUNT(DISTINCT VMUUID) as VMs,
+ CloudType,
+ b.UpdateTime
+ FROM(
+ SELECT
+ SiteName,
+ MAX(UpdateTime) AS latest
+ FROM VAnonCloudRecords
+ WHERE UpdateTime > DATE_SUB(NOW(), INTERVAL 1 YEAR)
+ GROUP BY SiteName
+ )
+ AS a
+ INNER JOIN VAnonCloudRecords
+ AS b
+ ON b.SiteName = a.SiteName AND b.UpdateTime = a.latest
+ GROUP BY SiteName;
+ """
+ fetchset = VAnonCloudRecord.objects.using('cloud').raw(sql_query)
+
+ for f in fetchset:
+ CloudSite.objects.update_or_create(
+ defaults={
+ 'Vms': f.VMs,
+ 'Script': f.CloudType,
+ 'updated': f.UpdateTime
+ },
+ SiteName=f.SiteName
+ )
+
+ log.info("Refreshed CloudSite")
+
+ except DatabaseError:
+ log.exception('Error while trying to refresh CloudSite')
+
+
+def refresh_gridsitesync():
+ try:
+        # The condition on EarliestEndTime and LatestEndTime is necessary to avoid pytz errors caused by zero dates such as '0000-00-00'
+ sql_query_summaries = """
+ SELECT
+ Site,
+ Month,
+ Year,
+ SUM(NumberOfJobs) AS RecordCountPublished,
+ MIN(EarliestEndTime) AS RecordStart,
+ MAX(LatestEndTime) AS RecordEnd
+ FROM VSuperSummaries
+ WHERE
+ Year >= YEAR(NOW()) - 2 AND
+ EarliestEndTime>'1900-01-01' AND
+ LatestEndTime>'1900-01-01'
+ GROUP BY
+ Site, Year, Month;
+ """
+ fetchset_Summaries = VSuperSummaries.objects.using('grid').raw(sql_query_summaries)
+
+ sql_query_syncrec = """
+ SELECT
+ Site,
+ Month,
+ Year,
+ SUM(NumberOfJobs) AS RecordCountInDb
+ FROM VSyncRecords
+ WHERE
+ Year >= YEAR(NOW()) - 2
+ GROUP BY
+ Site, Year, Month;
+ """
+ fetchset_SyncRecords = VSyncRecords.objects.using('grid').raw(sql_query_syncrec)
+
+ # Create empty dicts that will become dfs to be combined
+ summaries_dict = summaries_dict_standard.copy()
+ syncrecords_dict = syncrecords_dict_standard.copy()
+
+ # Fill the dicts with the fetched data
+ for row in fetchset_Summaries:
+ summaries_dict = fill_summaries_dict(summaries_dict, row)
+ summaries_dict = correct_dict(summaries_dict)
+ for row in fetchset_SyncRecords:
+ syncrecords_dict = fill_syncrecords_dict(syncrecords_dict, row)
+ syncrecords_dict = correct_dict(syncrecords_dict)
+
+ # Merge data from VSuperSummaries and VSyncRecords into one df
+ df_Summaries = pd.DataFrame.from_dict(summaries_dict)
+ df_SyncRecords = pd.DataFrame.from_dict(syncrecords_dict)
+ df_all = df_Summaries.merge(
+ df_SyncRecords,
+ left_on=['Site', 'Month', 'Year'],
+ right_on=['Site', 'Month', 'Year'],
+ how='inner'
+ )
+ fetchset = df_all.to_dict('index')
+
+        # Determine SyncStatus based on the difference between records published and in db,
+        # then upsert each row (the combined primary keys go outside the defaults dict)
+        for f in fetchset.values():
+            f['SyncStatus'] = determine_sync_status(f)
+            GridSiteSync.objects.update_or_create(
+                defaults={
+                    'RecordStart': f.get("RecordStart"),
+                    'RecordEnd': f.get("RecordEnd"),
+                    'RecordCountPublished': f.get("RecordCountPublished"),
+                    'RecordCountInDb': f.get("RecordCountInDb"),
+                    'SyncStatus': f.get("SyncStatus"),
+                },
+                YearMonth=get_year_month_str(f.get("Year"), f.get("Month")),
+                SiteName=f.get("Site"),
+                Month=f.get("Month"),
+                Year=f.get("Year"),
+            )
+ log.info("Refreshed GridSiteSync")
+
+ except DatabaseError:
+ log.exception('Error while trying to refresh GridSiteSync')
+
+
+def refresh_BenchmarksBySubmitHost():
+ views = ['VSummaries', 'VJobRecords', 'VNormalisedSummaries']
+ for view in views:
+ refresh_BenchmarksBySubmitHost_from_view(view)
+
+
+def refresh_BenchmarksBySubmitHost_from_view(view_name):
+ try:
+ if view_name == 'VSummaries':
+ sql_query = f"""
+ SELECT
+ Site,
+ SubmitHost,
+ ServiceLevelType,
+ ServiceLevel,
+ max(UpdateTime) AS LatestPublish
+ FROM {view_name}
+ WHERE UpdateTime > DATE_SUB(NOW(), INTERVAL 3 MONTH)
+ GROUP BY Site, SubmitHost;
+ """
+ elif view_name == 'VJobRecords':
+ sql_query = f"""
+ SELECT
+ Site,
+ SubmitHost,
+ ServiceLevelType,
+ ServiceLevel,
+ max(UpdateTime) AS LatestPublish
+ FROM {view_name}
+ WHERE EndTime > DATE_SUB(NOW(), INTERVAL 3 MONTH)
+ AND UpdateTime > DATE_SUB(NOW(), INTERVAL 3 MONTH)
+ GROUP BY Site, SubmitHost;
+ """
+ elif view_name == 'VNormalisedSummaries':
+ sql_query = f"""
+ SELECT
+ Site,
+ SubmitHost,
+ ServiceLevelType,
+ (NormalisedWallDuration / WallDuration) AS ServiceLevel,
+ max(UpdateTime) AS LatestPublish
+ FROM {view_name}
+ WHERE UpdateTime > DATE_SUB(NOW(), INTERVAL 3 MONTH)
+ AND WallDuration > 0
+ GROUP BY Site, SubmitHost;
+ """
+ else:
+ log.warning(f"Unknown view name: {view_name}")
+ return
+
+ # Dynamically get the model class from globals
+ model_class = globals()[view_name]
+ fetchset = model_class.objects.using('grid').raw(sql_query)
+
+ for f in fetchset:
+ BenchmarksBySubmithost.objects.update_or_create(
+ defaults={
+ 'UpdateTime': make_aware(f.LatestPublish) if is_naive(f.LatestPublish) else f.LatestPublish,
+ 'RecordType': model_class._meta.verbose_name
+ },
+ SiteName=f.Site,
+ SubmitHost=f.SubmitHost,
+ BenchmarkType=f.ServiceLevelType,
+ BenchmarkValue=f.ServiceLevel,
+ )
+
+ log.info(f"Refreshed BenchmarksBySubmitHost from {view_name}")
+
+ except Exception:
+ log.exception(f'Error while trying to refresh BenchmarksBySubmitHost from {view_name}')
+
+
+if __name__ == "__main__":
+ log.info('=====================')
+
+ refresh_gridsite()
+ refresh_cloudsite()
+ refresh_gridsitesync()
+ refresh_BenchmarksBySubmitHost()
+
+ log.info(
+ "Data retrieval and processing attempted. "
+ "Check the above logs for details on the sync status"
+ )
+ log.info('=====================')
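Since the module docstring says the script runs standalone via cron, the crontab entry would look roughly like the sketch below. The schedule, the interpreter path and the use of the app's venv Python are assumptions, not taken from the repository:

```text
# Hypothetical crontab entry: refresh the local SQLite copy hourly.
# Paths and schedule are placeholders for illustration only.
0 * * * * /usr/share/DJANGO_MONITORING_APP/venv/bin/python /usr/share/DJANGO_MONITORING_APP/monitoring/db_update_sqlite.py
```

Running it with the venv's interpreter matters because the script imports `django` and `pandas`, which are only installed inside the venv.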
diff --git a/monitoring/publishing/apps.py b/monitoring/publishing/apps.py
index 57c17f7..d5b06c3 100644
--- a/monitoring/publishing/apps.py
+++ b/monitoring/publishing/apps.py
@@ -5,4 +5,4 @@
class PublishingConfig(AppConfig):
- name = 'publishing'
+ name = 'monitoring.publishing'
diff --git a/monitoring/publishing/models.py b/monitoring/publishing/models.py
index ba65c9a..195872d 100644
--- a/monitoring/publishing/models.py
+++ b/monitoring/publishing/models.py
@@ -4,18 +4,67 @@
from django.db import models
+class GridSite(models.Model):
+ fetched = models.DateTimeField(auto_now=True)
+ SiteName = models.CharField(max_length=255, primary_key=True)
+ updated = models.DateTimeField()
+
+
+class VSuperSummaries(models.Model):
+ Site = models.CharField(max_length=255, primary_key=True)
+ LatestPublish = models.DateTimeField()
+ Month = models.IntegerField()
+ Year = models.IntegerField()
+ RecordStart = models.DateTimeField()
+ RecordEnd = models.DateTimeField()
+ RecordCountPublished = models.IntegerField()
+
+ class Meta:
+ managed = False
+ db_table = 'VSuperSummaries'
+
+
+class GridSiteSync(models.Model):
+ fetched = models.DateTimeField(auto_now=True)
+ SiteName = models.CharField(max_length=255)
+ YearMonth = models.CharField(max_length=255)
+ Year = models.IntegerField()
+ Month = models.IntegerField()
+ RecordStart = models.DateTimeField()
+ RecordEnd = models.DateTimeField()
+ RecordCountPublished = models.IntegerField()
+ RecordCountInDb = models.IntegerField()
+ SyncStatus = models.CharField(max_length=255)
+
+ class Meta:
+ # Descending order of Year and Month to display latest data first
+ ordering = ('SiteName', '-Year', '-Month')
+ unique_together = ('SiteName', 'YearMonth')
+
+
+class VSyncRecords(models.Model):
+ Site = models.CharField(max_length=255, primary_key=True)
+ RecordCountInDb = models.IntegerField()
+
+ class Meta:
+ managed = False
+ db_table = 'VSyncRecords'
+
+
class CloudSite(models.Model):
fetched = models.DateTimeField(auto_now=True)
- name = models.CharField(max_length=255, primary_key=True)
- script = models.CharField(max_length=255)
+ SiteName = models.CharField(max_length=255, primary_key=True)
+ Vms = models.IntegerField(default=0)
+ Script = models.CharField(max_length=255)
updated = models.DateTimeField()
class Meta:
- ordering = ('name',)
+ ordering = ('SiteName',)
class VAnonCloudRecord(models.Model):
SiteName = models.CharField(max_length=255, primary_key=True)
+ VMs = models.IntegerField()
CloudType = models.CharField(max_length=255)
UpdateTime = models.DateTimeField()
@@ -24,6 +73,25 @@ class Meta:
db_table = 'vanoncloudrecords'
def __str__(self):
- return '%s, running "%s", updated at %s' % (self.SiteName,
+ return '%s running "%s" updated at %s with %s records' % (
+ self.SiteName,
self.CloudType,
- self.UpdateTime)
+ self.UpdateTime,
+ self.VMs)
+
+
+class GridSiteSyncSubmitH(models.Model):
+ fetched = models.DateTimeField(auto_now=True)
+ SiteName = models.CharField(max_length=255)
+ YearMonth = models.CharField(max_length=255)
+ Year = models.IntegerField()
+ Month = models.IntegerField()
+ RecordStart = models.DateTimeField()
+ RecordEnd = models.DateTimeField()
+ RecordCountPublished = models.IntegerField()
+ RecordCountInDb = models.IntegerField()
+ SubmitHost = models.CharField(max_length=255)
+
+ class Meta:
+ ordering = ('SiteName', '-Year', '-Month')
+ unique_together = ('SiteName', 'YearMonth', 'SubmitHost')
diff --git a/monitoring/publishing/serializers.py b/monitoring/publishing/serializers.py
index 40309bb..b273462 100644
--- a/monitoring/publishing/serializers.py
+++ b/monitoring/publishing/serializers.py
@@ -1,8 +1,119 @@
from rest_framework import serializers
+from rest_framework.reverse import reverse
+
+from monitoring.publishing.models import (
+ CloudSite,
+ GridSite,
+ GridSiteSync,
+ GridSiteSyncSubmitH
+)
+
+
+class GridSiteSerializer(serializers.HyperlinkedModelSerializer):
+    # Override default format with None so that Python datetime is used as
+    # output format. Encoding will be determined by the renderer and can be
+    # formatted by a template filter.
+ updated = serializers.DateTimeField(format=None)
+
+ class Meta:
+ model = GridSite
+ fields = (
+ 'url',
+ 'SiteName',
+ 'updated'
+ )
+
+ # Sitename substitutes for pk
+ extra_kwargs = {
+ 'url': {'view_name': 'gridsite-detail', 'lookup_field': 'SiteName'}
+ }
+
+
+class GridSiteSyncSerializer(serializers.HyperlinkedModelSerializer):
+    # Override default format with None so that Python datetime is used as
+    # output format. Encoding will be determined by the renderer and can be
+    # formatted by a template filter.
+
+ class Meta:
+ model = GridSiteSync
+ fields = (
+ 'url',
+ 'SiteName',
+ 'YearMonth',
+ 'RecordStart',
+ 'RecordEnd',
+ 'RecordCountPublished',
+ 'RecordCountInDb',
+ 'SyncStatus'
+ )
+
+ # Sitename substitutes for pk
+ extra_kwargs = {
+ 'url': {'view_name': 'gridsitesync-detail', 'lookup_field': 'SiteName'}
+ }
-from models import CloudSite
class CloudSiteSerializer(serializers.HyperlinkedModelSerializer):
+    # Override default format with None so that Python datetime is used as
+    # output format. Encoding will be determined by the renderer and can be
+    # formatted by a template filter.
+ updated = serializers.DateTimeField(format=None)
+
class Meta:
model = CloudSite
- fields = ('url', 'name', 'script', 'updated')
+ fields = (
+ 'url',
+ 'SiteName',
+ 'Vms',
+ 'Script',
+ 'updated'
+ )
+
+ # Sitename substitutes for pk
+ extra_kwargs = {
+ 'url': {'view_name': 'cloudsite-detail', 'lookup_field': 'SiteName'}
+ }
+
+
+class MultipleFieldLookup(serializers.HyperlinkedIdentityField):
+    # HyperlinkedModelSerializer does not seem able to work with two lookup_fields.
+    # This class is ONLY capable of matching an object instance to its URL
+    # representation, i.e. via `SiteName` and `YearMonth`.
+    #
+    # Overrides the get_url() method to match an object instance to its URL representation.
+ def get_url(self, obj, view_name, request, format):
+ if not obj.SiteName or not obj.YearMonth:
+ return None
+
+ return request.build_absolute_uri(
+ reverse(
+ view_name,
+ kwargs={
+ 'SiteName': obj.SiteName,
+ 'YearMonth': obj.YearMonth
+ },
+ request=request,
+ format=format
+ ))
+
+
+class GridSiteSyncSubmitHSerializer(serializers.HyperlinkedModelSerializer):
+    # Override default format with None so that Python datetime is used as
+    # output format. Encoding will be determined by the renderer and can be
+    # formatted by a template filter.
+
+ # This helps us to match or construct the absolute URL based on the `SiteName` and `YearMonth`
+ url = MultipleFieldLookup(view_name='gridsync-submithost')
+
+ class Meta:
+ model = GridSiteSyncSubmitH
+ fields = (
+ 'url',
+ 'SiteName',
+ 'YearMonth',
+ 'RecordStart',
+ 'RecordEnd',
+ 'RecordCountPublished',
+ 'RecordCountInDb',
+ 'SubmitHost'
+ )
diff --git a/monitoring/publishing/static/style.css b/monitoring/publishing/static/style.css
new file mode 100644
index 0000000..85a5cf9
--- /dev/null
+++ b/monitoring/publishing/static/style.css
@@ -0,0 +1,53 @@
+/*- Menu Tabs E--------------------------- */
+
+ #tabsE {
+ float:left;
+ width:100%;
+ background:#333;
+ font-size:93%;
+ line-height:normal;
+
+ }
+ #tabsE ul {
+ margin:0;
+ padding:10px 10px 0 50px;
+ list-style:none;
+ }
+ #tabsE li {
+ display:inline;
+ margin:0;
+ padding:0;
+ }
+ #tabsE a {
+ float:left;
+ background:url("tableftE.gif") no-repeat left top;
+ margin:0;
+ padding:0 0 0 4px;
+ text-decoration:none;
+ }
+ #tabsE a span {
+ float:left;
+ display:block;
+ background:url("tabrightE.gif") no-repeat right top;
+ padding:5px 15px 4px 6px;
+ color:#fff;
+ }
+ /* Commented Backslash Hack hides rule from IE5-Mac \*/
+ #tabsE a span {float:none;}
+ /* End IE5-Mac hack */
+ #tabsE a:hover span {
+ color:#FFF;
+ }
+ #tabsE a:hover {
+ background-position:0% -42px;
+ }
+ #tabsE a:hover span {
+ background-position:100% -42px;
+ }
+
+ #tabsE #current a {
+ background-position:0% -42px;
+ }
+ #tabsE #current a span {
+ background-position:100% -42px;
+ }
diff --git a/monitoring/publishing/static/stylesheet.css b/monitoring/publishing/static/stylesheet.css
new file mode 100644
index 0000000..11ce5f0
--- /dev/null
+++ b/monitoring/publishing/static/stylesheet.css
@@ -0,0 +1,184 @@
+h12 {
+ font-size: 22px;
+ color: #336699;
+ background-color: #FFFFFF;
+ font-family: Arial, Helvetica, sans-serif;
+ font-weight: bold
+}
+h1 {
+ font-size: 1.2em;
+ color: #336699;
+ background-color: #FFFFFF;
+ font-family: Arial, Helvetica, sans-serif;
+}
+h2 {
+ font-size: 1em;
+}
+
+
+h6 {
+    text-align: right;
+    font-size: 0.6em;
+}
+body {
+ font-family: Arial, Helvetica, sans-serif;
+ background-color: #FFFFFF;
+ color: #000000;
+    font-size: 14px;
+}
+th {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 12px;
+ color: #009999;
+ background-color: #FFFFFF;
+ font-weight: bold;
+    text-align: left;
+}
+a:link {
+ text-decoration: none;
+ color: #336699;
+ font-weight: bold;
+}
+a:active {
+ text-decoration: none;
+ color: #336699;
+ font-weight: bold;
+ background-color: #FFFFFF;
+}
+a:visited {
+ text-decoration: none;
+ color: #996699;
+ font-weight: bold;
+}
+a:hover {
+ color : #666666;
+ background-color: #CCCCCC;
+ font-weight: bold;
+}
+hr {
+ color: #666666;
+ background-color: #FFFFFF;
+}
+.outlined {
+ border: 1px solid #000000;
+}
+.navbar-title {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #FFFFFF;
+ font-weight: bold;
+
+
+}
+.navbar {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 0.6em;
+ background-color: #FFFFFF;
+
+}
+p {
+ font-family: Arial, Helvetica, sans-serif;
+ color: #000000;
+ font-size: 14px;
+ font-weight: normal;
+}
+li {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #111111;
+ list-style-type: square;
+ font-weight: normal;
+}
+.sidebar-orange {
+ background-color: #ffe5b2;
+ background-image: url(images/orange-globe.jpg);
+ background-position: right center;
+ background-repeat: no-repeat;
+
+}
+.sidebar-green {
+ background-image: url(images/green-pulse.jpg);
+ background-repeat: repeat-x;
+ background-position: center bottom;
+ background-color: #dcf6de;
+}
+.sidebar-blue {
+ background-color: #dcf4f6;
+ background-image: url(images/blue-bars.jpg);
+ background-repeat: no-repeat;
+ background-position: right bottom;
+}
+.sidebar-pink {
+ background-color: #efdcf6;
+ background-image: url(images/pink-news.jpg);
+ background-repeat: no-repeat;
+ background-position: left bottom;
+}
+.note {
+ background-color: #E4E4E4;
+ border: 1px dotted #000000;
+ padding: 5px;
+}
+.tabletext {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #000000;
+ background-color: #DDDDDD;
+}
+.tabletextwarning {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #000000;
+ background-color: #FFFF00;
+}
+.tabletextok {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #000000;
+ background-color: #00FF00;
+}
+.tabletexterror {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #000000;
+ background-color: #FF0000;
+}
+.tabletextinfo {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #000000;
+ background-color: #00CCFF;
+}
+.tableheader {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ color: #FFFFFF;
+ background-color: #000000;
+ font-weight: bold;
+ text-align: center;
+}
+.navbar-heading {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 1em;
+ background-color: #FFFFFF;
+ font-weight: bold;
+ color: #000000;
+}
+.feintoutlined {
+ border: 1px dotted #CCCCCC;
+}
+.newsHeader {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 0.9em;
+ background-color: #97A6CD;
+ font-weight: bold;
+}
+.newsBody {
+ font-family: Arial, Helvetica, sans-serif;
+ font-size: 0.7em;
+ background-color: #ACBBDA;
+}
+.invisibleBorder {
+ border: thin dashed #EEEEEE;
+}
diff --git a/monitoring/publishing/templates/cloudsites.html b/monitoring/publishing/templates/cloudsites.html
index e146485..0ffc1ef 100644
--- a/monitoring/publishing/templates/cloudsites.html
+++ b/monitoring/publishing/templates/cloudsites.html
@@ -1,19 +1,23 @@
+
+
+
Sites publishing cloud accounting records
-
Sites publishing cloud accounting records from 2018-06-19 onwards
-
Page last updated: {{ last_fetched|date:"c" }}
+
Sites publishing cloud accounting records in the last year