Compare commits


No commits in common. "master" and "1.16.0" have entirely different histories.

253 changed files with 11393 additions and 27591 deletions

View File

@ -1,7 +1,6 @@
/.idea /.idea
/node_modules /node_modules
/data /data
/cypress
/out /out
/test /test
/kubernetes /kubernetes
@ -31,9 +30,6 @@ tsconfig.json
/tmp /tmp
/babel.config.js /babel.config.js
/ecosystem.config.js /ecosystem.config.js
/extra/healthcheck.exe
/extra/healthcheck
### .gitignore content (commented rules are duplicated) ### .gitignore content (commented rules are duplicated)

View File

@ -19,6 +19,3 @@ indent_size = 2
[*.vue] [*.vue]
trim_trailing_whitespace = false trim_trailing_whitespace = false
[*.go]
indent_style = tab

View File

@ -1,9 +1,3 @@
⚠️⚠️⚠️ Since we do not accept all types of pull requests and do not want to waste your time. Please be sure that you have read pull request rules:
https://github.com/louislam/uptime-kuma/blob/master/CONTRIBUTING.md#can-i-create-a-pull-request-for-uptime-kuma
Tick the checkbox if you understand [x]:
- [ ] I have read and understand the pull request rules.
# Description # Description
Fixes #(issue) Fixes #(issue)

View File

@ -18,7 +18,7 @@ jobs:
strategy: strategy:
matrix: matrix:
os: [macos-latest, ubuntu-latest, windows-latest] os: [macos-latest, ubuntu-latest, windows-latest]
node: [ 14, 16, 18, 19 ] node: [ 14, 16, 17, 18 ]
# See supported Node.js release schedule at https://nodejs.org/en/about/releases/ # See supported Node.js release schedule at https://nodejs.org/en/about/releases/
steps: steps:
@ -50,35 +50,3 @@ jobs:
cache: 'npm' cache: 'npm'
- run: npm install - run: npm install
- run: npm run lint - run: npm run lint
e2e-tests:
needs: [ check-linters ]
runs-on: ubuntu-latest
steps:
- run: git config --global core.autocrlf false # Mainly for Windows
- uses: actions/checkout@v3
- name: Use Node.js 14
uses: actions/setup-node@v3
with:
node-version: 14
cache: 'npm'
- run: npm install
- run: npm run build
- run: npm run cy:test
frontend-unit-tests:
needs: [ check-linters ]
runs-on: ubuntu-latest
steps:
- run: git config --global core.autocrlf false # Mainly for Windows
- uses: actions/checkout@v3
- name: Use Node.js 14
uses: actions/setup-node@v3
with:
node-version: 14
cache: 'npm'
- run: npm install
- run: npm run build
- run: npm run cy:run:unit

View File

@ -1,22 +0,0 @@
name: 'Automatically close stale issues and PRs'
on:
workflow_dispatch:
schedule:
- cron: '0 */6 * * *'
#Run every 6 hours
jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v5
with:
stale-issue-message: 'We are clearing up our old issues and your ticket has been open for 3 months with no activity. Remove stale label or comment or this will be closed in 2 days.'
close-issue-message: 'This issue was closed because it has been stalled for 2 days with no activity.'
days-before-stale: 90
days-before-close: 2
days-before-pr-stale: 999999999
days-before-pr-close: 1
exempt-issue-labels: 'News,Medium,High,discussion,bug,doc,feature-request'
exempt-issue-assignees: 'louislam'
operations-per-run: 200

.gitignore (vendored, 7 changed lines)
View File

@ -13,10 +13,3 @@ dist-ssr
/out /out
/tmp /tmp
.env .env
cypress/videos
cypress/screenshots
/extra/healthcheck.exe
/extra/healthcheck
/extra/healthcheck-armv7

View File

@ -8,7 +8,6 @@
"declaration-empty-line-before": null, "declaration-empty-line-before": null,
"alpha-value-notation": "number", "alpha-value-notation": "number",
"color-function-notation": "legacy", "color-function-notation": "legacy",
"shorthand-property-no-redundant-values": null, "shorthand-property-no-redundant-values": null
"color-hex-length": null,
} }
} }

View File

@ -1,6 +1,6 @@
# Project Info # Project Info
First of all, I want to thank everyone who made pull requests for Uptime Kuma. I never thought the GitHub Community would be so nice! Because of this, I also never thought that other people would actually read and edit my code. It is not very well structured or commented, sorry about that. First of all, thank you everyone who made pull requests for Uptime Kuma, I never thought GitHub Community can be that nice! And also because of this, I also never thought other people actually read my code and edit my code. It is not structured and commented so well, lol. Sorry about that.
The project was created with vite.js (vue3). Then I created a subdirectory called "server" for server part. Both frontend and backend share the same package.json. The project was created with vite.js (vue3). Then I created a subdirectory called "server" for server part. Both frontend and backend share the same package.json.
@ -27,39 +27,17 @@ The frontend code build into "dist" directory. The server (express.js) exposes t
## Can I create a pull request for Uptime Kuma? ## Can I create a pull request for Uptime Kuma?
Yes or no, it depends on what you will try to do. Since I don't want to waste your time, be sure to **create an empty draft pull request or open an issue, so we can have a discussion first**. Especially for a large pull request or you don't know it will be merged or not. (Updated 2022-04-24) Since I don't want to waste your time, be sure to create empty draft pull request, so we can discuss first.
Here are some references: ✅ Accept:
✅ Usually Accept:
- Bug/Security fix - Bug/Security fix
- Translations - Translations
- Adding notification providers - Adding notification providers
⚠️ Discussion First ⚠️ Discuss First
- Large pull requests - Large pull requests
- New features - New features
❌ Won't Merge
- Do not pass auto test
- Any breaking changes
- Duplicated pull request
- Buggy
- UI/UX is not close to Uptime Kuma
- Existing logic is completely modified or deleted for no reason
- A function that is completely out of scope
- Convert existing code into other programming languages
- Unnecessary large code changes (Hard to review, causes code conflicts to other pull requests)
The above cases cannot cover all situations.
I (@louislam) have the final say. If your pull request does not meet my expectations, I will reject it, no matter how much time you spend on it. Therefore, it is essential to have a discussion beforehand.
I will mark your pull request in the [milestones](https://github.com/louislam/uptime-kuma/milestones), if I am plan to review and merge it.
Also, please don't rush or ask for ETA, because I have to understand the pull request, make sure it is no breaking changes and stick to my vision of this project, especially for large pull requests.
### Recommended Pull Request Guideline ### Recommended Pull Request Guideline
Before deep into coding, discussion first is preferred. Creating an empty pull request for discussion would be recommended. Before deep into coding, discussion first is preferred. Creating an empty pull request for discussion would be recommended.
@ -75,15 +53,22 @@ Before deep into coding, discussion first is preferred. Creating an empty pull r
1. Click "Change to draft" 1. Click "Change to draft"
1. Discussion 1. Discussion
#### ❌ Won't Merge
- Any breaking changes
- Duplicated pull request
- Buggy
- Existing logic is completely modified or deleted
- A function that is completely out of scope
## Project Styles ## Project Styles
I personally do not like it when something requires so much learning and configuration before you can finally start the app. I personally do not like something need to learn so much and need to config so much before you can finally start the app.
- Easy to install for non-Docker users, no native build dependency is needed (at least for x86_64), no extra config, no extra effort required to get it running - Easy to install for non-Docker users, no native build dependency is needed (at least for x86_64), no extra config, no extra effort to get it run
- Single container for Docker users, no very complex docker-compose file. Just map the volume and expose the port, then good to go - Single container for Docker users, no very complex docker-compose file. Just map the volume and expose the port, then good to go
- Settings should be configurable in the frontend. Environment variable is not encouraged, unless it is related to startup such as `DATA_DIR`. - Settings should be configurable in the frontend. Env var is not encouraged.
- Easy to use - Easy to use
- The web UI styling should be consistent and nice.
## Coding Styles ## Coding Styles
@ -95,8 +80,8 @@ I personally do not like it when something requires so much learning and configu
## Name convention ## Name convention
- Javascript/Typescript: camelCaseType - Javascript/Typescript: camelCaseType
- SQLite: snake_case (Underscore) - SQLite: underscore_type
- CSS/SCSS: kebab-case (Dash) - CSS/SCSS: dash-type
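
For illustration only, here is a small hypothetical sketch of those three conventions side by side (the identifiers, column names, and class name below are made up and not taken from the codebase):

```js
// Hypothetical example: JavaScript uses camelCase,
// SQLite columns use snake_case, CSS/SCSS classes use kebab-case.
const monitorName = "My Monitor";          // camelCase in JS/TS

const row = {
    monitor_name: monitorName,             // snake_case column in SQLite
    retry_interval: 60,
};

const cssClass = "monitor-name-label";     // kebab-case in CSS/SCSS
console.log(row, cssClass);
```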
## Tools ## Tools
@ -177,23 +162,16 @@ The data and socket logic are in `src/mixins/socket.js`.
## Unit Test ## Unit Test
It is an end-to-end testing. It is using Jest and Puppeteer.
```bash ```bash
npm run build npm run build
npm test npm test
``` ```
## Dependencies By default, the Chromium window will be shown up during the test. Specifying `HEADLESS_TEST=1` for terminal environments.
Both frontend and backend share the same package.json. However, the frontend dependencies are eventually not used in the production environment, because it is usually also baked into dist files. So: ## Update Dependencies
- Frontend dependencies = "devDependencies"
- Examples: vue, chart.js
- Backend dependencies = "dependencies"
- Examples: socket.io, sqlite3
- Development dependencies = "devDependencies"
- Examples: eslint, sass
### Update Dependencies
Install `ncu` Install `ncu`
https://github.com/raineorshine/npm-check-updates https://github.com/raineorshine/npm-check-updates
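
Relating to the "Unit Test" section above: the 1.16.0 side describes end-to-end tests driven by Jest and Puppeteer. As a rough illustration only (the URL, port, and selector below are assumptions, not taken from the repository's actual spec files), a test under that setup could look like this:

```js
// Minimal sketch of a Jest + Puppeteer e2e spec (hypothetical example).
// With the "jest-puppeteer" preset, `page` and `browser` are provided as globals.
// Run headless in a terminal-only environment with: HEADLESS_TEST=1 npm test
describe("Uptime Kuma smoke test", () => {
    beforeAll(async () => {
        // Port 3002 is an assumption here, matching the test script shown
        // later in this diff; the real test server port may differ.
        await page.goto("http://localhost:3002", { waitUntil: "networkidle0" });
    });

    it("loads the app shell", async () => {
        await page.waitForSelector("body");
        const title = await page.title();
        expect(title).toBeTruthy();
    });
});
```

The `HEADLESS_TEST` variable mentioned above maps to the `headless` launch option in `config/jest-puppeteer.config.js`, which also appears later in this diff.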

View File

@ -7,32 +7,33 @@
<img src="./public/icon.svg" width="128" alt="" /> <img src="./public/icon.svg" width="128" alt="" />
</div> </div>
Uptime Kuma is an easy-to-use self-hosted monitoring tool. It is a self-hosted monitoring tool like "Uptime Robot".
<img src="https://user-images.githubusercontent.com/1336778/212262296-e6205815-ad62-488c-83ec-a5b0d0689f7c.jpg" width="700" alt="" /> <img src="https://uptime.kuma.pet/img/dark.jpg" width="700" alt="" />
## 🥔 Live Demo ## 🥔 Live Demo
Try it! Try it!
- Tokyo Demo Server: https://demo.uptime.kuma.pet (Sponsored by [Uptime Kuma Sponsors](https://github.com/louislam/uptime-kuma#%EF%B8%8F-sponsors)) https://demo.uptime.kuma.pet
- Europe Demo Server: https://demo.uptime-kuma.karimi.dev:27000 (Provided by [@mhkarimi1383](https://github.com/mhkarimi1383))
It is a temporary live demo, all data will be deleted after 10 minutes. Use the one that is closer to you, but I suggest that you should install and try it out for the best demo experience. It is a temporary live demo, all data will be deleted after 10 minutes. The server is located in Tokyo, so if you live far from there, it may affect your experience. I suggest that you should install and try it out for the best demo experience.
VPS is sponsored by Uptime Kuma sponsors on [Open Collective](https://opencollective.com/uptime-kuma)! Thank you so much!
## ⭐ Features ## ⭐ Features
* Monitoring uptime for HTTP(s) / TCP / HTTP(s) Keyword / Ping / DNS Record / Push / Steam Game Server / Docker Containers * Monitoring uptime for HTTP(s) / TCP / HTTP(s) Keyword / Ping / DNS Record / Push / Steam Game Server.
* Fancy, Reactive, Fast UI/UX * Fancy, Reactive, Fast UI/UX.
* Notifications via Telegram, Discord, Gotify, Slack, Pushover, Email (SMTP), and [90+ notification services, click here for the full list](https://github.com/louislam/uptime-kuma/tree/master/src/components/notifications) * Notifications via Telegram, Discord, Gotify, Slack, Pushover, Email (SMTP), and [90+ notification services, click here for the full list](https://github.com/louislam/uptime-kuma/tree/master/src/components/notifications).
* 20 second intervals * 20 second intervals.
* [Multi Languages](https://github.com/louislam/uptime-kuma/tree/master/src/languages) * [Multi Languages](https://github.com/louislam/uptime-kuma/tree/master/src/languages)
* Multiple status pages * Multiple Status Pages
* Map status pages to specific domains * Map Status Page to Domain
* Ping chart * Ping Chart
* Certificate info * Certificate Info
* Proxy support * Proxy Support
* 2FA support * 2FA available
## 🔧 How to Install ## 🔧 How to Install
@ -44,14 +45,14 @@ docker run -d --restart=always -p 3001:3001 -v uptime-kuma:/app/data --name upti
⚠️ Please use a **local volume** only. Other types such as NFS are not supported. ⚠️ Please use a **local volume** only. Other types such as NFS are not supported.
Uptime Kuma is now running on http://localhost:3001 Browse to http://localhost:3001 after starting.
### 💪🏻 Non-Docker ### 💪🏻 Non-Docker
Required Tools: Required Tools:
- [Node.js](https://nodejs.org/en/download/) >= 14 - [Node.js](https://nodejs.org/en/download/) >= 14
- [Git](https://git-scm.com/downloads) - [Git](https://git-scm.com/downloads)
- [pm2](https://pm2.keymetrics.io/) - For running Uptime Kuma in the background - [pm2](https://pm2.keymetrics.io/) - For run in background
```bash ```bash
# Update your npm to the latest version # Update your npm to the latest version
@ -73,7 +74,7 @@ pm2 start server/server.js --name uptime-kuma
``` ```
Uptime Kuma is now running on http://localhost:3001 Browse to http://localhost:3001 after starting.
More useful PM2 Commands More useful PM2 Commands
@ -105,7 +106,7 @@ https://github.com/louislam/uptime-kuma/milestones
Project Plan: Project Plan:
https://github.com/users/louislam/projects/4/views/1 https://github.com/louislam/uptime-kuma/projects/1
## ❤️ Sponsors ## ❤️ Sponsors
@ -150,20 +151,13 @@ You can discuss or ask for help in [issues](https://github.com/louislam/uptime-k
### Subreddit ### Subreddit
My Reddit account: [u/louislamlam](https://reddit.com/u/louislamlam). My Reddit account: louislamlam
You can mention me if you ask a question on Reddit. You can mention me if you ask a question on Reddit.
[r/Uptime kuma](https://www.reddit.com/r/UptimeKuma/) https://www.reddit.com/r/UptimeKuma/
## Contribute ## Contribute
### Test Pull Requests ### Beta Version
There are a lot of pull requests right now, but I don't have time to test them all.
If you want to help, you can check this:
https://github.com/louislam/uptime-kuma/wiki/Test-Pull-Requests
### Test Beta Version
Check out the latest beta release here: https://github.com/louislam/uptime-kuma/releases Check out the latest beta release here: https://github.com/louislam/uptime-kuma/releases
@ -175,5 +169,5 @@ If you want to translate Uptime Kuma into your language, please read: https://gi
Feel free to correct my grammar in this README, source code, or wiki, as my mother language is not English and my grammar is not that great. Feel free to correct my grammar in this README, source code, or wiki, as my mother language is not English and my grammar is not that great.
### Create Pull Requests ### Pull Requests
If you want to modify Uptime Kuma, please read this guide and follow the rules here: https://github.com/louislam/uptime-kuma/blob/master/CONTRIBUTING.md If you want to modify Uptime Kuma, this guideline may be useful for you: https://github.com/louislam/uptime-kuma/blob/master/CONTRIBUTING.md

View File

@ -2,12 +2,15 @@
## Reporting a Vulnerability ## Reporting a Vulnerability
Please report security issues to https://github.com/louislam/uptime-kuma/security/advisories/new. Please report security issues to uptime@kuma.pet.
Do not use the public issue tracker or discuss it in the public as it will cause more damage. Do not use the issue tracker or discuss it in the public as it will cause more damage.
## Supported Versions ## Supported Versions
Use this section to tell people about which versions of your project are
currently being supported with security updates.
### Uptime Kuma Versions ### Uptime Kuma Versions
You should use or upgrade to the latest version of Uptime Kuma. All `1.X.X` versions are upgradable to the lastest version. You should use or upgrade to the latest version of Uptime Kuma. All `1.X.X` versions are upgradable to the lastest version.

View File

@ -1,28 +0,0 @@
const { defineConfig } = require("cypress");
module.exports = defineConfig({
projectId: "vyjuem",
e2e: {
experimentalStudio: true,
setupNodeEvents(on, config) {
},
fixturesFolder: "test/cypress/fixtures",
screenshotsFolder: "test/cypress/screenshots",
videosFolder: "test/cypress/videos",
downloadsFolder: "test/cypress/downloads",
supportFile: "test/cypress/support/e2e.js",
baseUrl: "http://localhost:3002",
defaultCommandTimeout: 10000,
pageLoadTimeout: 60000,
viewportWidth: 1920,
viewportHeight: 1080,
specPattern: [
"test/cypress/e2e/setup.cy.js",
"test/cypress/e2e/**/*.js"
],
},
env: {
baseUrl: "http://localhost:3002",
},
});

View File

@ -1,10 +0,0 @@
const { defineConfig } = require("cypress");
module.exports = defineConfig({
e2e: {
supportFile: false,
specPattern: [
"test/cypress/unit/**/*.js"
],
}
});

config/jest-debug-env.js (new file, 33 lines)
View File

@ -0,0 +1,33 @@
const PuppeteerEnvironment = require("jest-environment-puppeteer");
const util = require("util");
class DebugEnv extends PuppeteerEnvironment {
async handleTestEvent(event, state) {
const ignoredEvents = [
"setup",
"add_hook",
"start_describe_definition",
"add_test",
"finish_describe_definition",
"run_start",
"run_describe_start",
"test_start",
"hook_start",
"hook_success",
"test_fn_start",
"test_fn_success",
"test_done",
"run_describe_finish",
"run_finish",
"teardown",
"test_fn_failure",
];
if (!ignoredEvents.includes(event.name)) {
console.log(
new Date().toString() + ` Unhandled event [${event.name}] ` + util.inspect(event)
);
}
}
}
module.exports = DebugEnv;

View File

@ -0,0 +1,5 @@
module.exports = {
"rootDir": "..",
"testRegex": "./test/frontend.spec.js",
};

View File

@ -0,0 +1,20 @@
module.exports = {
"launch": {
"dumpio": true,
"slowMo": 500,
"headless": process.env.HEADLESS_TEST || false,
"userDataDir": "./data/test-chrome-profile",
args: [
"--disable-setuid-sandbox",
"--disable-gpu",
"--disable-dev-shm-usage",
"--no-default-browser-check",
"--no-experiments",
"--no-first-run",
"--no-pings",
"--no-sandbox",
"--no-zygote",
"--single-process",
],
}
};

config/jest.config.js (new file, 12 lines)
View File

@ -0,0 +1,12 @@
module.exports = {
"verbose": true,
"preset": "jest-puppeteer",
"globals": {
"__DEV__": true
},
"testRegex": "./test/e2e.spec.js",
"testEnvironment": "./config/jest-debug-env.js",
"rootDir": "..",
"testTimeout": 30000,
};

View File

@ -1,38 +1,18 @@
import legacy from "@vitejs/plugin-legacy"; import legacy from "@vitejs/plugin-legacy";
import vue from "@vitejs/plugin-vue"; import vue from "@vitejs/plugin-vue";
import { defineConfig } from "vite"; import { defineConfig } from "vite";
import visualizer from "rollup-plugin-visualizer";
import viteCompression from "vite-plugin-compression";
const postCssScss = require("postcss-scss"); const postCssScss = require("postcss-scss");
const postcssRTLCSS = require("postcss-rtlcss"); const postcssRTLCSS = require("postcss-rtlcss");
const viteCompressionFilter = /\.(js|mjs|json|css|html|svg)$/i;
// https://vitejs.dev/config/ // https://vitejs.dev/config/
export default defineConfig({ export default defineConfig({
server: {
port: 3000,
},
define: {
"FRONTEND_VERSION": JSON.stringify(process.env.npm_package_version),
},
plugins: [ plugins: [
vue(), vue(),
legacy({ legacy({
targets: [ "since 2015" ], targets: [ "ie > 11" ],
}), additionalLegacyPolyfills: [ "regenerator-runtime/runtime" ]
visualizer({ })
filename: "tmp/dist-stats.html"
}),
viteCompression({
algorithm: "gzip",
filter: viteCompressionFilter,
}),
viteCompression({
algorithm: "brotliCompress",
filter: viteCompressionFilter,
}),
], ],
css: { css: {
postcss: { postcss: {
@ -41,13 +21,4 @@ export default defineConfig({
"plugins": [ postcssRTLCSS ] "plugins": [ postcssRTLCSS ]
} }
}, },
build: {
rollupOptions: {
output: {
manualChunks(id, { getModuleInfo, getModuleIds }) {
}
}
},
}
}); });

View File

@ -1,5 +0,0 @@
-- You should not modify if this have pushed to Github, unless it does serious wrong with the db.
BEGIN TRANSACTION;
ALTER TABLE monitor_group
ADD send_url BOOLEAN DEFAULT 0 NOT NULL;
COMMIT;

View File

@ -1,18 +0,0 @@
-- You should not modify if this have pushed to Github, unless it does serious wrong with the db.
BEGIN TRANSACTION;
CREATE TABLE docker_host (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
user_id INT NOT NULL,
docker_daemon VARCHAR(255),
docker_type VARCHAR(255),
name VARCHAR(255)
);
ALTER TABLE monitor
ADD docker_host INTEGER REFERENCES docker_host(id);
ALTER TABLE monitor
ADD docker_container VARCHAR(255);
COMMIT;

View File

@ -1,18 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE monitor
ADD auth_method VARCHAR(250);
ALTER TABLE monitor
ADD auth_domain TEXT;
ALTER TABLE monitor
ADD auth_workstation TEXT;
COMMIT;
BEGIN TRANSACTION;
UPDATE monitor
SET auth_method = 'basic'
WHERE basic_auth_user is not null;
COMMIT;

View File

@ -1,18 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE monitor
ADD radius_username VARCHAR(255);
ALTER TABLE monitor
ADD radius_password VARCHAR(255);
ALTER TABLE monitor
ADD radius_calling_station_id VARCHAR(50);
ALTER TABLE monitor
ADD radius_called_station_id VARCHAR(50);
ALTER TABLE monitor
ADD radius_secret VARCHAR(255);
COMMIT

View File

@ -1,10 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE monitor
ADD database_connection_string VARCHAR(2000);
ALTER TABLE monitor
ADD database_query TEXT;
COMMIT

View File

@ -1,25 +0,0 @@
-- You should not modify if this have pushed to Github, unless it does serious wrong with the db.
BEGIN TRANSACTION;
ALTER TABLE monitor
ADD grpc_url VARCHAR(255) default null;
ALTER TABLE monitor
ADD grpc_protobuf TEXT default null;
ALTER TABLE monitor
ADD grpc_body TEXT default null;
ALTER TABLE monitor
ADD grpc_metadata TEXT default null;
ALTER TABLE monitor
ADD grpc_method VARCHAR(255) default null;
ALTER TABLE monitor
ADD grpc_service_name VARCHAR(255) default null;
ALTER TABLE monitor
ADD grpc_enable_tls BOOLEAN default 0 not null;
COMMIT;

View File

@ -1,83 +0,0 @@
-- You should not modify if this have pushed to Github, unless it does serious wrong with the db.
BEGIN TRANSACTION;
-- Just for someone who tested maintenance before (patch-maintenance-table.sql)
DROP TABLE IF EXISTS maintenance_status_page;
DROP TABLE IF EXISTS monitor_maintenance;
DROP TABLE IF EXISTS maintenance;
DROP TABLE IF EXISTS maintenance_timeslot;
-- maintenance
CREATE TABLE [maintenance] (
[id] INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
[title] VARCHAR(150) NOT NULL,
[description] TEXT NOT NULL,
[user_id] INTEGER REFERENCES [user]([id]) ON DELETE SET NULL ON UPDATE CASCADE,
[active] BOOLEAN NOT NULL DEFAULT 1,
[strategy] VARCHAR(50) NOT NULL DEFAULT 'single',
[start_date] DATETIME,
[end_date] DATETIME,
[start_time] TIME,
[end_time] TIME,
[weekdays] VARCHAR2(250) DEFAULT '[]',
[days_of_month] TEXT DEFAULT '[]',
[interval_day] INTEGER
);
CREATE INDEX [manual_active] ON [maintenance] (
[strategy],
[active]
);
CREATE INDEX [active] ON [maintenance] ([active]);
CREATE INDEX [maintenance_user_id] ON [maintenance] ([user_id]);
-- maintenance_status_page
CREATE TABLE maintenance_status_page (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
status_page_id INTEGER NOT NULL,
maintenance_id INTEGER NOT NULL,
CONSTRAINT FK_maintenance FOREIGN KEY (maintenance_id) REFERENCES maintenance (id) ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT FK_status_page FOREIGN KEY (status_page_id) REFERENCES status_page (id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE INDEX [status_page_id_index]
ON [maintenance_status_page]([status_page_id]);
CREATE INDEX [maintenance_id_index]
ON [maintenance_status_page]([maintenance_id]);
-- maintenance_timeslot
CREATE TABLE [maintenance_timeslot] (
[id] INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
[maintenance_id] INTEGER NOT NULL CONSTRAINT [FK_maintenance] REFERENCES [maintenance]([id]) ON DELETE CASCADE ON UPDATE CASCADE,
[start_date] DATETIME NOT NULL,
[end_date] DATETIME,
[generated_next] BOOLEAN DEFAULT 0
);
CREATE INDEX [maintenance_id] ON [maintenance_timeslot] ([maintenance_id] DESC);
CREATE INDEX [active_timeslot_index] ON [maintenance_timeslot] (
[maintenance_id] DESC,
[start_date] DESC,
[end_date] DESC
);
CREATE INDEX [generated_next_index] ON [maintenance_timeslot] ([generated_next]);
-- monitor_maintenance
CREATE TABLE monitor_maintenance (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
monitor_id INTEGER NOT NULL,
maintenance_id INTEGER NOT NULL,
CONSTRAINT FK_maintenance FOREIGN KEY (maintenance_id) REFERENCES maintenance (id) ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT FK_monitor FOREIGN KEY (monitor_id) REFERENCES monitor (id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE INDEX [maintenance_id_index2] ON [monitor_maintenance]([maintenance_id]);
CREATE INDEX [monitor_id_index] ON [monitor_maintenance]([monitor_id]);
COMMIT;

View File

@ -1,10 +0,0 @@
-- You should not modify if this have pushed to Github, unless it does serious wrong with the db.
BEGIN TRANSACTION;
ALTER TABLE monitor
ADD resend_interval INTEGER default 0 not null;
ALTER TABLE heartbeat
ADD down_count INTEGER default 0 not null;
COMMIT;

View File

@ -4,5 +4,5 @@ WORKDIR /app
# Install apprise, iputils for non-root ping, setpriv # Install apprise, iputils for non-root ping, setpriv
RUN apk add --no-cache iputils setpriv dumb-init python3 py3-cryptography py3-pip py3-six py3-yaml py3-click py3-markdown py3-requests py3-requests-oauthlib && \ RUN apk add --no-cache iputils setpriv dumb-init python3 py3-cryptography py3-pip py3-six py3-yaml py3-click py3-markdown py3-requests py3-requests-oauthlib && \
pip3 --no-cache-dir install apprise==1.2.1 && \ pip3 --no-cache-dir install apprise==0.9.8.3 && \
rm -rf /root/.cache rm -rf /root/.cache

View File

@ -1,16 +0,0 @@
############################################
# Build in Golang
# Run npm run build-healthcheck-armv7 in the host first, another it will be super slow where it is building the armv7 healthcheck
############################################
FROM golang:1.19.4-buster
WORKDIR /app
ARG TARGETPLATFORM
COPY ./extra/ ./extra/
# Compile healthcheck.go
RUN apt update && \
apt --yes --no-install-recommends install curl && \
curl -sL https://deb.nodesource.com/setup_18.x | bash && \
apt --yes --no-install-recommends install nodejs && \
node ./extra/build-healthcheck.js $TARGETPLATFORM && \
apt --yes remove nodejs

View File

@ -11,9 +11,8 @@ WORKDIR /app
RUN apt update && \ RUN apt update && \
apt --yes --no-install-recommends install python3 python3-pip python3-cryptography python3-six python3-yaml python3-click python3-markdown python3-requests python3-requests-oauthlib \ apt --yes --no-install-recommends install python3 python3-pip python3-cryptography python3-six python3-yaml python3-click python3-markdown python3-requests python3-requests-oauthlib \
sqlite3 iputils-ping util-linux dumb-init && \ sqlite3 iputils-ping util-linux dumb-init && \
pip3 --no-cache-dir install apprise==1.2.1 && \ pip3 --no-cache-dir install apprise==0.9.8.3 && \
rm -rf /var/lib/apt/lists/* && \ rm -rf /var/lib/apt/lists/*
apt --yes autoremove
# Install cloudflared # Install cloudflared
# dpkg --add-architecture arm: cloudflared do not provide armhf, this is workaround. Read more: https://github.com/cloudflare/cloudflared/issues/583 # dpkg --add-architecture arm: cloudflared do not provide armhf, this is workaround. Read more: https://github.com/cloudflare/cloudflared/issues/583
@ -23,6 +22,5 @@ RUN node ./extra/download-cloudflared.js $TARGETPLATFORM && \
apt update && \ apt update && \
apt --yes --no-install-recommends install ./cloudflared.deb && \ apt --yes --no-install-recommends install ./cloudflared.deb && \
rm -rf /var/lib/apt/lists/* && \ rm -rf /var/lib/apt/lists/* && \
rm -f cloudflared.deb && \ rm -f cloudflared.deb
apt --yes autoremove

View File

@ -1,4 +1,4 @@
# Simple docker-compose.yml # Simple docker-composer.yml
# You can change your port or volume location # You can change your port or volume location
version: '3.3' version: '3.3'

View File

@ -1,82 +1,31 @@
############################################
# Build in Golang
# Run npm run build-healthcheck-armv7 in the host first, another it will be super slow where it is building the armv7 healthcheck
# Check file: builder-go.dockerfile
############################################
FROM louislam/uptime-kuma:builder-go AS build_healthcheck
############################################
# Build in Node.js
############################################
FROM louislam/uptime-kuma:base-debian AS build FROM louislam/uptime-kuma:base-debian AS build
WORKDIR /app WORKDIR /app
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1 ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1
COPY .npmrc .npmrc
COPY package.json package.json
COPY package-lock.json package-lock.json
RUN npm ci --omit=dev
COPY . .
COPY --from=build_healthcheck /app/extra/healthcheck /app/extra/healthcheck
RUN chmod +x /app/extra/entrypoint.sh
############################################ COPY . .
# ⭐ Main Image RUN npm ci --production && \
############################################ chmod +x /app/extra/entrypoint.sh
FROM louislam/uptime-kuma:base-debian AS release FROM louislam/uptime-kuma:base-debian AS release
WORKDIR /app WORKDIR /app
# Copy app files from build layer # Copy app files from build layer
COPY --from=build /app /app COPY --from=build /app /app
EXPOSE 3001 EXPOSE 3001
VOLUME ["/app/data"] VOLUME ["/app/data"]
HEALTHCHECK --interval=60s --timeout=30s --start-period=180s --retries=5 CMD extra/healthcheck HEALTHCHECK --interval=60s --timeout=30s --start-period=180s --retries=5 CMD node extra/healthcheck.js
ENTRYPOINT ["/usr/bin/dumb-init", "--", "extra/entrypoint.sh"] ENTRYPOINT ["/usr/bin/dumb-init", "--", "extra/entrypoint.sh"]
CMD ["node", "server/server.js"] CMD ["node", "server/server.js"]
############################################
# Mark as Nightly
############################################
FROM release AS nightly FROM release AS nightly
RUN npm run mark-as-nightly RUN npm run mark-as-nightly
############################################
# Build an image for testing pr
############################################
FROM louislam/uptime-kuma:base-debian AS pr-test
WORKDIR /app
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1
## Install Git
RUN apt update \
&& apt --yes --no-install-recommends install curl \
&& curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg \
&& chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg \
&& echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" | tee /etc/apt/sources.list.d/github-cli.list > /dev/null \
&& apt update \
&& apt --yes --no-install-recommends install git
## Empty the directory, because we have to clone the Git repo.
RUN rm -rf ./* && chown node /app
USER node
RUN git config --global user.email "no-reply@no-reply.com"
RUN git config --global user.name "PR Tester"
RUN git clone https://github.com/louislam/uptime-kuma.git .
RUN npm ci
EXPOSE 3000 3001
VOLUME ["/app/data"]
HEALTHCHECK --interval=60s --timeout=30s --start-period=180s --retries=5 CMD node extra/healthcheck.js
CMD ["npm", "run", "start-pr-test"]
############################################
# Upload the artifact to Github # Upload the artifact to Github
############################################
FROM louislam/uptime-kuma:base-debian AS upload-artifact FROM louislam/uptime-kuma:base-debian AS upload-artifact
WORKDIR / WORKDIR /
RUN apt update && \ RUN apt update && \

View File

@ -3,12 +3,10 @@ WORKDIR /app
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1 ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=1
COPY .npmrc .npmrc
COPY package.json package.json
COPY package-lock.json package-lock.json
RUN npm ci --omit=dev
COPY . . COPY . .
RUN chmod +x /app/extra/entrypoint.sh RUN npm ci --production && \
chmod +x /app/extra/entrypoint.sh
FROM louislam/uptime-kuma:base-alpine AS release FROM louislam/uptime-kuma:base-alpine AS release
WORKDIR /app WORKDIR /app

View File

@ -32,10 +32,6 @@ if (! exists) {
process.exit(1); process.exit(1);
} }
/**
* Commit updated files
* @param {string} version Version to update to
*/
function commit(version) { function commit(version) {
let msg = "Update to " + version; let msg = "Update to " + version;
@ -51,10 +47,6 @@ function commit(version) {
console.log(res.stdout.toString().trim()); console.log(res.stdout.toString().trim());
} }
/**
* Create a tag with the specified version
* @param {string} version Tag to create
*/
function tag(version) { function tag(version) {
let res = childProcess.spawnSync("git", [ "tag", version ]); let res = childProcess.spawnSync("git", [ "tag", version ]);
console.log(res.stdout.toString().trim()); console.log(res.stdout.toString().trim());
@ -63,11 +55,6 @@ function tag(version) {
console.log(res.stdout.toString().trim()); console.log(res.stdout.toString().trim());
} }
/**
* Check if a tag exists for the specified version
* @param {string} version Version to check
* @returns {boolean} Does the tag already exist
*/
function tagExists(version) { function tagExists(version) {
if (! version) { if (! version) {
throw new Error("invalid version"); throw new Error("invalid version");

View File

@ -1,27 +0,0 @@
const childProcess = require("child_process");
const fs = require("fs");
const platform = process.argv[2];
if (!platform) {
console.error("No platform??");
process.exit(1);
}
if (platform === "linux/arm/v7") {
console.log("Arch: armv7");
if (fs.existsSync("./extra/healthcheck-armv7")) {
fs.renameSync("./extra/healthcheck-armv7", "./extra/healthcheck");
console.log("Already built in the host, skip.");
process.exit(0);
} else {
console.log("prebuilt not found, it will be slow! You should execute `npm run build-healthcheck-armv7` before build.");
}
} else {
if (fs.existsSync("./extra/healthcheck-armv7")) {
fs.rmSync("./extra/healthcheck-armv7");
}
}
const output = childProcess.execSync("go build -x -o ./extra/healthcheck ./extra/healthcheck.go").toString("utf8");
console.log(output);

View File

@ -1,33 +0,0 @@
const childProcess = require("child_process");
if (!process.env.UPTIME_KUMA_GH_REPO) {
console.error("Please set a repo to the environment variable 'UPTIME_KUMA_GH_REPO' (e.g. mhkarimi1383:goalert-notification)");
process.exit(1);
}
let inputArray = process.env.UPTIME_KUMA_GH_REPO.split(":");
if (inputArray.length !== 2) {
console.error("Invalid format. Please set a repo to the environment variable 'UPTIME_KUMA_GH_REPO' (e.g. mhkarimi1383:goalert-notification)");
}
let name = inputArray[0];
let branch = inputArray[1];
console.log("Checkout pr");
// Checkout the pr
let result = childProcess.spawnSync("git", [ "remote", "add", name, `https://github.com/${name}/uptime-kuma` ]);
console.log(result.stdout.toString());
console.error(result.stderr.toString());
result = childProcess.spawnSync("git", [ "fetch", name, branch ]);
console.log(result.stdout.toString());
console.error(result.stderr.toString());
result = childProcess.spawnSync("git", [ "checkout", `${name}/${branch}`, "--force" ]);
console.log(result.stdout.toString());
console.error(result.stderr.toString());

View File

@ -25,10 +25,6 @@ if (platform === "linux/amd64") {
const file = fs.createWriteStream("cloudflared.deb"); const file = fs.createWriteStream("cloudflared.deb");
get("https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-" + arch + ".deb"); get("https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-" + arch + ".deb");
/**
* Download specified file
* @param {string} url URL to request
*/
function get(url) { function get(url) {
http.get(url, function (res) { http.get(url, function (res) {
if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) { if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {

View File

@ -1,81 +0,0 @@
/*
* If changed, have to run `npm run build-docker-builder-go`.
* This script should be run after a period of time (180s), because the server may need some time to prepare.
*/
package main
import (
"crypto/tls"
"io/ioutil"
"log"
"net/http"
"os"
"runtime"
"time"
)
func main() {
isFreeBSD := runtime.GOOS == "freebsd"
// process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";
http.DefaultTransport.(*http.Transport).TLSClientConfig = &tls.Config{
InsecureSkipVerify: true,
}
client := http.Client{
Timeout: 28 * time.Second,
}
sslKey := os.Getenv("UPTIME_KUMA_SSL_KEY")
if len(sslKey) == 0 {
sslKey = os.Getenv("SSL_KEY")
}
sslCert := os.Getenv("UPTIME_KUMA_SSL_CERT")
if len(sslCert) == 0 {
sslCert = os.Getenv("SSL_CERT")
}
hostname := os.Getenv("UPTIME_KUMA_HOST")
if len(hostname) == 0 && !isFreeBSD {
hostname = os.Getenv("HOST")
}
if len(hostname) == 0 {
hostname = "127.0.0.1"
}
port := os.Getenv("UPTIME_KUMA_PORT")
if len(port) == 0 {
port = os.Getenv("PORT")
}
if len(port) == 0 {
port = "3001"
}
protocol := ""
if len(sslKey) != 0 && len(sslCert) != 0 {
protocol = "https"
} else {
protocol = "http"
}
url := protocol + "://" + hostname + ":" + port
log.Println("Checking " + url)
resp, err := client.Get(url)
if err != nil {
log.Fatalln(err)
}
defer resp.Body.Close()
_, err = ioutil.ReadAll(resp.Body)
if err != nil {
log.Fatalln(err)
}
log.Printf("Health Check OK [Res Code: %d]\n", resp.StatusCode)
}

View File

@ -1,5 +1,4 @@
/* /*
* Deprecated: Changed to healthcheck.go, it will be deleted in the future.
* This script should be run after a period of time (180s), because the server may need some time to prepare. * This script should be run after a period of time (180s), because the server may need some time to prepare.
*/ */
const { FBSD } = require("../server/util-server"); const { FBSD } = require("../server/util-server");

View File

@ -5,7 +5,7 @@ const util = require("../src/util");
util.polyfill(); util.polyfill();
const oldVersion = pkg.version; const oldVersion = pkg.version;
const newVersion = oldVersion + "-nightly-" + util.genSecret(8); const newVersion = oldVersion + "-nightly";
console.log("Old Version: " + oldVersion); console.log("Old Version: " + oldVersion);
console.log("New Version: " + newVersion); console.log("New Version: " + newVersion);

View File

@ -43,11 +43,6 @@ const main = async () => {
console.log("Finished."); console.log("Finished.");
}; };
/**
* Ask question of user
* @param {string} question Question to ask
* @returns {Promise<string>} Users response
*/
function question(question) { function question(question) {
return new Promise((resolve) => { return new Promise((resolve) => {
rl.question(question, (answer) => { rl.question(question, (answer) => {

View File

@ -53,11 +53,6 @@ const main = async () => {
console.log("Finished."); console.log("Finished.");
}; };
/**
* Ask question of user
* @param {string} question Question to ask
* @returns {Promise<string>} Users response
*/
function question(question) { function question(question) {
return new Promise((resolve) => { return new Promise((resolve) => {
rl.question(question, (answer) => { rl.question(question, (answer) => {

View File

@ -135,11 +135,6 @@ server.listen({
udp: 5300 udp: 5300
}); });
/**
* Get human readable request type from request code
* @param {number} code Request code to translate
* @returns {string} Human readable request type
*/
function type(code) { function type(code) {
for (let name in Packet.TYPE) { for (let name in Packet.TYPE) {
if (Packet.TYPE[name] === code) { if (Packet.TYPE[name] === code) {

View File

@ -11,7 +11,6 @@ class SimpleMqttServer {
this.port = port; this.port = port;
} }
/** Start the MQTT server */
start() { start() {
this.server.listen(this.port, () => { this.server.listen(this.port, () => {
console.log("server started and listening on port ", this.port); console.log("server started and listening on port ", this.port);

View File

@ -1,45 +1,51 @@
// Need to use ES6 to read language files // Need to use ES6 to read language files
import fs from "fs"; import fs from "fs";
import path from "path";
import util from "util"; import util from "util";
import rmSync from "../fs-rmSync.js"; import rmSync from "../fs-rmSync.js";
// https://stackoverflow.com/questions/13786160/copy-folder-recursively-in-node-js
/** /**
* Copy across the required language files * Look ma, it's cp -R.
* Creates a local directory (./languages) and copies the required files * @param {string} src The path to the thing to copy.
* into it. * @param {string} dest The path to the new copy.
* @param {string} langCode Code of language to update. A file will be
* created with this code if one does not already exist
* @param {string} baseLang The second base language file to copy. This
* will be ignored if set to "en" as en.js is copied by default
*/ */
function copyFiles(langCode, baseLang) { const copyRecursiveSync = function (src, dest) {
if (fs.existsSync("./languages")) { let exists = fs.existsSync(src);
rmSync("./languages", { recursive: true }); let stats = exists && fs.statSync(src);
} let isDirectory = exists && stats.isDirectory();
fs.mkdirSync("./languages");
if (!fs.existsSync(`../../src/languages/${langCode}.js`)) { if (isDirectory) {
fs.closeSync(fs.openSync(`./languages/${langCode}.js`, "a")); fs.mkdirSync(dest);
fs.readdirSync(src).forEach(function (childItemName) {
copyRecursiveSync(path.join(src, childItemName),
path.join(dest, childItemName));
});
} else { } else {
fs.copyFileSync(`../../src/languages/${langCode}.js`, `./languages/${langCode}.js`); fs.copyFileSync(src, dest);
}
fs.copyFileSync("../../src/languages/en.js", "./languages/en.js");
if (baseLang !== "en") {
fs.copyFileSync(`../../src/languages/${baseLang}.js`, `./languages/${baseLang}.js`);
} }
};
console.log("Arguments:", process.argv);
const baseLangCode = process.argv[2] || "en";
console.log("Base Lang: " + baseLangCode);
if (fs.existsSync("./languages")) {
rmSync("./languages", { recursive: true });
} }
copyRecursiveSync("../../src/languages", "./languages");
/** const en = (await import("./languages/en.js")).default;
* Update the specified language file const baseLang = (await import(`./languages/${baseLangCode}.js`)).default;
* @param {string} langCode Language code to update const files = fs.readdirSync("./languages");
* @param {string} baseLang Second language to copy keys from console.log("Files:", files);
*/
async function updateLanguage(langCode, baseLangCode) { for (const file of files) {
const en = (await import("./languages/en.js")).default; if (! file.endsWith(".js")) {
const baseLang = (await import(`./languages/${baseLangCode}.js`)).default; console.log("Skipping " + file);
continue;
}
let file = langCode + ".js";
console.log("Processing " + file); console.log("Processing " + file);
const lang = await import("./languages/" + file); const lang = await import("./languages/" + file);
@ -77,20 +83,5 @@ async function updateLanguage(langCode, baseLangCode) {
fs.writeFileSync(`../../src/languages/${file}`, code); fs.writeFileSync(`../../src/languages/${file}`, code);
} }
// Get command line arguments
const baseLangCode = process.env.npm_config_baselang || "en";
const langCode = process.env.npm_config_language;
// We need the file to edit
if (langCode == null) {
throw new Error("Argument --language=<code> must be provided");
}
console.log("Base Lang: " + baseLangCode);
console.log("Updating: " + langCode);
copyFiles(langCode, baseLangCode);
await updateLanguage(langCode, baseLangCode);
rmSync("./languages", { recursive: true }); rmSync("./languages", { recursive: true });
console.log("Done. Fixing formatting by ESLint..."); console.log("Done. Fixing formatting by ESLint...");

View File

@ -36,8 +36,10 @@ if (! exists) {
} }
/** /**
* Commit updated files * Updates the version number in package.json and commits it to git.
* @param {string} version Version to update to * @param {string} version - The new version number
*
* Generated by Trelent
*/ */
function commit(version) { function commit(version) {
let msg = "Update to " + version; let msg = "Update to " + version;
@ -51,19 +53,16 @@ function commit(version) {
} }
} }
/**
* Create a tag with the specified version
* @param {string} version Tag to create
*/
function tag(version) { function tag(version) {
let res = childProcess.spawnSync("git", [ "tag", version ]); let res = childProcess.spawnSync("git", [ "tag", version ]);
console.log(res.stdout.toString().trim()); console.log(res.stdout.toString().trim());
} }
/** /**
* Check if a tag exists for the specified version * Checks if a given version is already tagged in the git repository.
* @param {string} version Version to check * @param {string} version - The version to check for.
* @returns {boolean} Does the tag already exist *
* Generated by Trelent
*/ */
function tagExists(version) { function tagExists(version) {
if (! version) { if (! version) {

View File

@ -10,10 +10,6 @@ if (!newVersion) {
updateWiki(newVersion); updateWiki(newVersion);
/**
* Update the wiki with new version number
* @param {string} newVersion Version to update to
*/
function updateWiki(newVersion) { function updateWiki(newVersion) {
const wikiDir = "./tmp/wiki"; const wikiDir = "./tmp/wiki";
const howToUpdateFilename = "./tmp/wiki/🆙-How-to-Update.md"; const howToUpdateFilename = "./tmp/wiki/🆙-How-to-Update.md";
@ -43,13 +39,9 @@ function updateWiki(newVersion) {
safeDelete(wikiDir); safeDelete(wikiDir);
} }
/**
* Check if a directory exists and then delete it
* @param {string} dir Directory to delete
*/
function safeDelete(dir) { function safeDelete(dir) {
if (fs.existsSync(dir)) { if (fs.existsSync(dir)) {
fs.rm(dir, { fs.rmdirSync(dir, {
recursive: true, recursive: true,
}); });
} }

package-lock.json (generated, 21085 changed lines): file diff suppressed because it is too large.

View File

@ -1,6 +1,6 @@
{ {
"name": "uptime-kuma", "name": "uptime-kuma",
"version": "1.19.6", "version": "1.16.0",
"license": "MIT", "license": "MIT",
"repository": { "repository": {
"type": "git", "type": "git",
@ -23,23 +23,23 @@
"start-server": "node server/server.js", "start-server": "node server/server.js",
"start-server-dev": "cross-env NODE_ENV=development node server/server.js", "start-server-dev": "cross-env NODE_ENV=development node server/server.js",
"build": "vite build --config ./config/vite.config.js", "build": "vite build --config ./config/vite.config.js",
"test": "node test/prepare-test-server.js && npm run jest-backend", "test": "node test/prepare-test-server.js && node server/server.js --port=3002 --data-dir=./data/test/ --test",
"test-with-build": "npm run build && npm test", "test-with-build": "npm run build && npm test",
"jest-backend": "cross-env TEST_BACKEND=1 jest --runInBand --detectOpenHandles --forceExit --config=./config/jest-backend.config.js", "jest": "node test/prepare-jest.js && npm run jest-frontend && npm run jest-backend",
"jest-frontend": "cross-env TEST_FRONTEND=1 jest --config=./config/jest-frontend.config.js",
"jest-backend": "cross-env TEST_BACKEND=1 jest --config=./config/jest-backend.config.js",
"tsc": "tsc", "tsc": "tsc",
"vite-preview-dist": "vite preview --host --config ./config/vite.config.js", "vite-preview-dist": "vite preview --host --config ./config/vite.config.js",
"build-docker": "npm run build && npm run build-docker-debian && npm run build-docker-alpine", "build-docker": "npm run build && npm run build-docker-debian && npm run build-docker-alpine",
"build-docker-alpine-base": "docker buildx build -f docker/alpine-base.dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:base-alpine . --push", "build-docker-alpine-base": "docker buildx build -f docker/alpine-base.dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:base-alpine . --push",
"build-docker-debian-base": "docker buildx build -f docker/debian-base.dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:base-debian . --push", "build-docker-debian-base": "docker buildx build -f docker/debian-base.dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:base-debian . --push",
"build-docker-builder-go": "docker buildx build -f docker/builder-go.dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:builder-go . --push",
"build-docker-alpine": "node ./extra/env2arg.js docker buildx build -f docker/dockerfile-alpine --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:alpine -t louislam/uptime-kuma:1-alpine -t louislam/uptime-kuma:$VERSION-alpine --target release . --push", "build-docker-alpine": "node ./extra/env2arg.js docker buildx build -f docker/dockerfile-alpine --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:alpine -t louislam/uptime-kuma:1-alpine -t louislam/uptime-kuma:$VERSION-alpine --target release . --push",
"build-docker-debian": "node ./extra/env2arg.js docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma -t louislam/uptime-kuma:1 -t louislam/uptime-kuma:$VERSION -t louislam/uptime-kuma:debian -t louislam/uptime-kuma:1-debian -t louislam/uptime-kuma:$VERSION-debian --target release . --push", "build-docker-debian": "node ./extra/env2arg.js docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma -t louislam/uptime-kuma:1 -t louislam/uptime-kuma:$VERSION -t louislam/uptime-kuma:debian -t louislam/uptime-kuma:1-debian -t louislam/uptime-kuma:$VERSION-debian --target release . --push",
"build-docker-nightly": "npm run build && docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:nightly --target nightly . --push", "build-docker-nightly": "npm run build && docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:nightly --target nightly . --push",
"build-docker-nightly-alpine": "docker buildx build -f docker/dockerfile-alpine --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:nightly-alpine --target nightly . --push", "build-docker-nightly-alpine": "docker buildx build -f docker/dockerfile-alpine --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:nightly-alpine --target nightly . --push",
"build-docker-nightly-amd64": "docker buildx build -f docker/dockerfile --platform linux/amd64 -t louislam/uptime-kuma:nightly-amd64 --target nightly . --push --progress plain", "build-docker-nightly-amd64": "docker buildx build -f docker/dockerfile --platform linux/amd64 -t louislam/uptime-kuma:nightly-amd64 --target nightly . --push --progress plain",
"build-docker-pr-test": "docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64 -t louislam/uptime-kuma:pr-test --target pr-test . --push",
"upload-artifacts": "docker buildx build -f docker/dockerfile --platform linux/amd64 -t louislam/uptime-kuma:upload-artifact --build-arg VERSION --build-arg GITHUB_TOKEN --target upload-artifact . --progress plain", "upload-artifacts": "docker buildx build -f docker/dockerfile --platform linux/amd64 -t louislam/uptime-kuma:upload-artifact --build-arg VERSION --build-arg GITHUB_TOKEN --target upload-artifact . --progress plain",
"setup": "git checkout 1.19.6 && npm ci --production && npm run download-dist", "setup": "git checkout 1.16.0 && npm ci --production && npm run download-dist",
"download-dist": "node extra/download-dist.js", "download-dist": "node extra/download-dist.js",
"mark-as-nightly": "node extra/mark-as-nightly.js", "mark-as-nightly": "node extra/mark-as-nightly.js",
"reset-password": "node extra/reset-password.js", "reset-password": "node extra/reset-password.js",
@ -52,129 +52,104 @@
"test-nodejs16": "docker build --progress plain -f test/ubuntu-nodejs16.dockerfile .", "test-nodejs16": "docker build --progress plain -f test/ubuntu-nodejs16.dockerfile .",
"simple-dns-server": "node extra/simple-dns-server.js", "simple-dns-server": "node extra/simple-dns-server.js",
"simple-mqtt-server": "node extra/simple-mqtt-server.js", "simple-mqtt-server": "node extra/simple-mqtt-server.js",
"update-language-files": "cd extra/update-language-files && node index.js && cross-env-shell eslint ../../src/languages/$npm_config_language.js --fix", "update-language-files-with-base-lang": "cd extra/update-language-files && node index.js %npm_config_base_lang% && eslint ../../src/languages/**.js --fix",
"update-language-files": "cd extra/update-language-files && node index.js && eslint ../../src/languages/**.js --fix",
"ncu-patch": "npm-check-updates -u -t patch", "ncu-patch": "npm-check-updates -u -t patch",
"release-final": "node extra/update-version.js && npm run build-docker && node ./extra/press-any-key.js && npm run upload-artifacts && node ./extra/update-wiki-version.js", "release-final": "node extra/update-version.js && npm run build-docker && node ./extra/press-any-key.js && npm run upload-artifacts && node ./extra/update-wiki-version.js",
"release-beta": "node extra/beta/update-version.js && npm run build && node ./extra/env2arg.js docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:$VERSION -t louislam/uptime-kuma:beta . --target release --push && node ./extra/press-any-key.js && npm run upload-artifacts", "release-beta": "node extra/beta/update-version.js && npm run build && node ./extra/env2arg.js docker buildx build -f docker/dockerfile --platform linux/amd64,linux/arm64,linux/arm/v7 -t louislam/uptime-kuma:$VERSION -t louislam/uptime-kuma:beta . --target release --push && node ./extra/press-any-key.js && npm run upload-artifacts",
"git-remove-tag": "git tag -d", "git-remove-tag": "git tag -d"
"build-dist-and-restart": "npm run build && npm run start-server-dev",
"start-pr-test": "node extra/checkout-pr.js && npm install && npm run dev",
"cy:test": "node test/prepare-test-server.js && node server/server.js --port=3002 --data-dir=./data/test/ --e2e",
"cy:run": "npx cypress run --browser chrome --headless --config-file ./config/cypress.config.js",
"cy:run:unit": "npx cypress run --browser chrome --headless --config-file ./config/cypress.frontend.config.js",
"cypress-open": "concurrently -k -r \"node test/prepare-test-server.js && node server/server.js --port=3002 --data-dir=./data/test/\" \"cypress open --config-file ./config/cypress.config.js\"",
"build-healthcheck-armv7": "cross-env GOOS=linux GOARCH=arm GOARM=7 go build -x -o ./extra/healthcheck-armv7 ./extra/healthcheck.go"
}, },
"dependencies": { "dependencies": {
"@grpc/grpc-js": "~1.7.3", "@fortawesome/fontawesome-svg-core": "~1.2.36",
"@louislam/ping": "~0.4.2-mod.1", "@fortawesome/free-regular-svg-icons": "~5.15.4",
"@louislam/sqlite3": "15.1.2", "@fortawesome/free-solid-svg-icons": "~5.15.4",
"@fortawesome/vue-fontawesome": "~3.0.0-5",
"@louislam/sqlite3": "~15.0.6",
"@popperjs/core": "~2.10.2",
"args-parser": "~1.3.0", "args-parser": "~1.3.0",
"axios": "~0.27.0", "axios": "~0.26.1",
"axios-ntlm": "1.3.0", "badge-maker": "^3.3.1",
"badge-maker": "~3.3.1",
"bcryptjs": "~2.4.3", "bcryptjs": "~2.4.3",
"bootstrap": "5.1.3",
"bree": "~7.1.5", "bree": "~7.1.5",
"cacheable-lookup": "~6.0.4", "chardet": "^1.3.0",
"chardet": "~1.4.0", "chart.js": "~3.6.2",
"chartjs-adapter-dayjs": "~1.0.0",
"check-password-strength": "^2.0.5", "check-password-strength": "^2.0.5",
"cheerio": "~1.0.0-rc.12", "chroma-js": "^2.1.2",
"chroma-js": "~2.4.2",
"command-exists": "~1.2.9", "command-exists": "~1.2.9",
"compare-versions": "~3.6.0", "compare-versions": "~3.6.0",
"compression": "~1.7.4", "dayjs": "~1.10.8",
"dayjs": "~1.11.5",
"express": "~4.17.3", "express": "~4.17.3",
"express-basic-auth": "~1.2.1", "express-basic-auth": "~1.2.1",
"express-static-gzip": "~2.1.7", "favico.js": "^0.3.10",
"form-data": "~4.0.0", "form-data": "~4.0.0",
"http-graceful-shutdown": "~3.1.7", "http-graceful-shutdown": "~3.1.7",
"http-proxy-agent": "~5.0.0", "http-proxy-agent": "^5.0.0",
"https-proxy-agent": "~5.0.1", "https-proxy-agent": "^5.0.0",
"iconv-lite": "~0.6.3", "iconv-lite": "^0.6.3",
"jsesc": "~3.0.2", "jsonwebtoken": "~8.5.1",
"jsonwebtoken": "~9.0.0", "jwt-decode": "^3.1.2",
"jwt-decode": "~3.1.2", "limiter": "^2.1.0",
"limiter": "~2.1.0", "mqtt": "^4.2.8",
"mongodb": "~4.13.0",
"mqtt": "~4.3.7",
"mssql": "~8.1.4",
"mysql2": "~2.3.3",
"node-cloudflared-tunnel": "~1.0.9", "node-cloudflared-tunnel": "~1.0.9",
"node-radius-client": "~1.0.0",
"nodemailer": "~6.6.5", "nodemailer": "~6.6.5",
"notp": "~2.0.3", "notp": "~2.0.3",
"password-hash": "~1.2.2", "password-hash": "~1.2.2",
"pg": "~8.8.0", "postcss-rtlcss": "~3.4.1",
"pg-connection-string": "~2.5.0", "postcss-scss": "~4.0.3",
"prismjs": "^1.27.0",
"prom-client": "~13.2.0", "prom-client": "~13.2.0",
"prometheus-api-metrics": "~3.2.1", "prometheus-api-metrics": "~3.2.1",
"protobufjs": "~7.1.1", "qrcode": "~1.5.0",
"redbean-node": "~0.2.0", "redbean-node": "0.1.3",
"redis": "~4.5.1", "socket.io": "~4.4.1",
"socket.io": "~4.5.3", "socket.io-client": "~4.4.1",
"socket.io-client": "~4.5.3", "socks-proxy-agent": "^6.1.1",
"socks-proxy-agent": "6.1.1", "tar": "^6.1.11",
"tar": "~6.1.11",
"tcp-ping": "~0.1.1", "tcp-ping": "~0.1.1",
"thirty-two": "~1.0.2" "thirty-two": "~1.0.2",
"timezones-list": "~3.0.1",
"v-pagination-3": "~0.1.7",
"vue": "next",
"vue-chart-3": "3.0.9",
"vue-confirm-dialog": "~1.0.2",
"vue-contenteditable": "~3.0.4",
"vue-i18n": "~9.1.9",
"vue-image-crop-upload": "~3.0.3",
"vue-multiselect": "~3.0.0-alpha.2",
"vue-prism-editor": "^2.0.0-alpha.2",
"vue-qrcode": "~1.0.0",
"vue-router": "~4.0.14",
"vue-toastification": "~2.0.0-rc.5",
"vuedraggable": "~4.1.0"
}, },
"devDependencies": { "devDependencies": {
"@actions/github": "~5.0.1", "@actions/github": "~5.0.1",
"@babel/eslint-parser": "~7.17.0", "@babel/eslint-parser": "~7.17.0",
"@babel/preset-env": "^7.15.8", "@babel/preset-env": "^7.15.8",
"@fortawesome/fontawesome-svg-core": "~1.2.36",
"@fortawesome/free-regular-svg-icons": "~5.15.4",
"@fortawesome/free-solid-svg-icons": "~5.15.4",
"@fortawesome/vue-fontawesome": "~3.0.0-5",
"@popperjs/core": "~2.10.2",
"@types/bootstrap": "~5.1.9", "@types/bootstrap": "~5.1.9",
"@vitejs/plugin-legacy": "~2.1.0", "@vitejs/plugin-legacy": "~1.6.4",
"@vitejs/plugin-vue": "~3.1.0", "@vitejs/plugin-vue": "~1.9.4",
"@vue/compiler-sfc": "~3.2.36", "@vue/compiler-sfc": "~3.2.31",
"@vuepic/vue-datepicker": "~3.4.8",
"aedes": "^0.46.3", "aedes": "^0.46.3",
"babel-plugin-rewire": "~1.2.0", "babel-plugin-rewire": "~1.2.0",
"bootstrap": "5.1.3",
"chart.js": "~3.6.2",
"chartjs-adapter-dayjs": "~1.0.0",
"concurrently": "^7.1.0", "concurrently": "^7.1.0",
"core-js": "~3.26.1", "core-js": "~3.18.3",
"cross-env": "~7.0.3", "cross-env": "~7.0.3",
"cypress": "^10.1.0",
"delay": "^5.0.0",
"dns2": "~2.0.1", "dns2": "~2.0.1",
"eslint": "~8.14.0", "eslint": "~8.14.0",
"eslint-plugin-vue": "~8.7.1", "eslint-plugin-vue": "~8.7.1",
"favico.js": "~0.3.10",
"jest": "~27.2.5", "jest": "~27.2.5",
"postcss-html": "~1.5.0", "jest-puppeteer": "~6.0.3",
"postcss-rtlcss": "~3.7.2", "npm-check-updates": "^12.5.9",
"postcss-scss": "~4.0.4", "postcss-html": "^1.3.1",
"prismjs": "~1.29.0", "puppeteer": "~13.1.3",
"qrcode": "~1.5.0",
"rollup-plugin-visualizer": "^5.6.0",
"sass": "~1.42.1", "sass": "~1.42.1",
"stylelint": "~14.7.1", "stylelint": "~14.7.1",
"stylelint-config-standard": "~25.0.0", "stylelint-config-standard": "~25.0.0",
"terser": "~5.15.0",
"timezones-list": "~3.0.1",
"typescript": "~4.4.4", "typescript": "~4.4.4",
"v-pagination-3": "~0.1.7", "vite": "~2.6.14",
"vite": "~3.1.0",
"vite-plugin-compression": "^0.5.1",
"vue": "next",
"vue-chart-3": "3.0.9",
"vue-confirm-dialog": "~1.0.2",
"vue-contenteditable": "~3.0.4",
"vue-i18n": "~9.2.2",
"vue-image-crop-upload": "~3.0.3",
"vue-multiselect": "~3.0.0-alpha.2",
"vue-prism-editor": "~2.0.0-alpha.2",
"vue-qrcode": "~1.0.0",
"vue-router": "~4.0.14",
"vue-toastification": "~2.0.0-rc.5",
"vuedraggable": "~4.1.0",
"wait-on": "^6.0.1" "wait-on": "^6.0.1"
} }
} }

File diff suppressed because one or more lines are too long

Binary image changed: Before 893 B, After 6.4 KiB

View File

@ -63,12 +63,6 @@ function myAuthorizer(username, password, callback) {
}); });
} }
/**
* Use basic auth if auth is not disabled
* @param {express.Request} req Express request object
* @param {express.Response} res Express response object
* @param {express.NextFunction} next Express next function
*/
exports.basicAuth = async function (req, res, next) { exports.basicAuth = async function (req, res, next) {
const middleware = basicAuth({ const middleware = basicAuth({
authorizer: myAuthorizer, authorizer: myAuthorizer,

View File

@ -1,86 +0,0 @@
const https = require("https");
const http = require("http");
const CacheableLookup = require("cacheable-lookup");
const { Settings } = require("./settings");
const { log } = require("../src/util");
class CacheableDnsHttpAgent {
static cacheable = new CacheableLookup();
static httpAgentList = {};
static httpsAgentList = {};
static enable = false;
/**
* Register or unregister cacheable DNS lookup on the global agents
*/
static async update() {
log.debug("CacheableDnsHttpAgent", "update");
let isEnable = await Settings.get("dnsCache");
if (isEnable !== this.enable) {
log.debug("CacheableDnsHttpAgent", "value changed");
if (isEnable) {
log.debug("CacheableDnsHttpAgent", "enable");
this.cacheable.install(http.globalAgent);
this.cacheable.install(https.globalAgent);
} else {
log.debug("CacheableDnsHttpAgent", "disable");
this.cacheable.uninstall(http.globalAgent);
this.cacheable.uninstall(https.globalAgent);
}
}
this.enable = isEnable;
}
/**
* Attach cacheable to HTTP agent
* @param {http.Agent} agent Agent to install
*/
static install(agent) {
this.cacheable.install(agent);
}
/**
* @param {https.AgentOptions} agentOptions
* @returns {https.Agent}
*/
static getHttpsAgent(agentOptions) {
if (!this.enable) {
return new https.Agent(agentOptions);
}
let key = JSON.stringify(agentOptions);
if (!(key in this.httpsAgentList)) {
this.httpsAgentList[key] = new https.Agent(agentOptions);
this.cacheable.install(this.httpsAgentList[key]);
}
return this.httpsAgentList[key];
}
/**
* @param {http.AgentOptions} agentOptions
* @returns {http.Agent}
*/
static getHttpAgent(agentOptions) {
if (!this.enable) {
return new http.Agent(agentOptions);
}
let key = JSON.stringify(agentOptions);
if (!(key in this.httpAgentList)) {
this.httpAgentList[key] = new http.Agent(agentOptions);
this.cacheable.install(this.httpAgentList[key]);
}
return this.httpAgentList[key];
}
}
module.exports = {
CacheableDnsHttpAgent,
};
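The agent pool above is easiest to see in use. Below is a minimal sketch (not part of the diff) of how an axios request might pick up the DNS-cached agents; the require path is an assumption based on where this file lives under server/.
const axios = require("axios");
const { CacheableDnsHttpAgent } = require("./cacheable-dns-http-agent"); // hypothetical relative path
async function fetchWithCachedDns(url) {
    // Sync the agents with the "dnsCache" setting before issuing the request
    await CacheableDnsHttpAgent.update();
    return axios.request({
        url,
        httpsAgent: CacheableDnsHttpAgent.getHttpsAgent({ maxCachedSessions: 0 }),
        httpAgent: CacheableDnsHttpAgent.getHttpAgent({ maxCachedSessions: 0 }),
    });
}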

View File

@ -25,7 +25,7 @@ exports.startInterval = () => {
let checkBeta = await setting("checkBeta"); let checkBeta = await setting("checkBeta");
if (checkBeta && res.data.beta) { if (checkBeta && res.data.beta) {
if (compareVersions.compare(res.data.beta, res.data.slow, ">")) { if (compareVersions.compare(res.data.beta, res.data.beta, ">")) {
exports.latestVersion = res.data.beta; exports.latestVersion = res.data.beta;
return; return;
} }

View File

@ -4,8 +4,7 @@
const { TimeLogger } = require("../src/util"); const { TimeLogger } = require("../src/util");
const { R } = require("redbean-node"); const { R } = require("redbean-node");
const { UptimeKumaServer } = require("./uptime-kuma-server"); const { UptimeKumaServer } = require("./uptime-kuma-server");
const server = UptimeKumaServer.getInstance(); const io = UptimeKumaServer.getInstance().io;
const io = server.io;
const { setting } = require("./util-server"); const { setting } = require("./util-server");
const checkVersion = require("./check-version"); const checkVersion = require("./check-version");
@ -23,10 +22,7 @@ async function sendNotificationList(socket) {
]); ]);
for (let bean of list) { for (let bean of list) {
let notificationObject = bean.export(); result.push(bean.export());
notificationObject.isDefault = (notificationObject.isDefault === 1);
notificationObject.active = (notificationObject.active === 1);
result.push(notificationObject);
} }
io.to(socket.userID).emit("notificationList", result); io.to(socket.userID).emit("notificationList", result);
@ -122,41 +118,14 @@ async function sendInfo(socket) {
socket.emit("info", { socket.emit("info", {
version: checkVersion.version, version: checkVersion.version,
latestVersion: checkVersion.latestVersion, latestVersion: checkVersion.latestVersion,
primaryBaseURL: await setting("primaryBaseURL"), primaryBaseURL: await setting("primaryBaseURL")
serverTimezone: await server.getTimezone(),
serverTimezoneOffset: server.getTimezoneOffset(),
}); });
} }
/**
* Send list of docker hosts to client
* @param {Socket} socket Socket.io socket instance
* @returns {Promise<Bean[]>}
*/
async function sendDockerHostList(socket) {
const timeLogger = new TimeLogger();
let result = [];
let list = await R.find("docker_host", " user_id = ? ", [
socket.userID,
]);
for (let bean of list) {
result.push(bean.toJSON());
}
io.to(socket.userID).emit("dockerHostList", result);
timeLogger.print("Send Docker Host List");
return list;
}
module.exports = { module.exports = {
sendNotificationList, sendNotificationList,
sendImportantHeartbeatList, sendImportantHeartbeatList,
sendHeartbeatList, sendHeartbeatList,
sendProxyList, sendProxyList,
sendInfo, sendInfo,
sendDockerHostList
}; };
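For context, a hedged sketch of how the new sendDockerHostList helper above might be invoked once a socket has authenticated; the afterLogin wrapper is hypothetical.
const { sendDockerHostList } = require("./client"); // hypothetical relative path
async function afterLogin(socket) {
    // Emits "dockerHostList" to the user's room, mirroring the other send* helpers
    await sendDockerHostList(socket);
}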

View File

@ -53,19 +53,11 @@ class Database {
"patch-2fa-invalidate-used-token.sql": true, "patch-2fa-invalidate-used-token.sql": true,
"patch-notification_sent_history.sql": true, "patch-notification_sent_history.sql": true,
"patch-monitor-basic-auth.sql": true, "patch-monitor-basic-auth.sql": true,
"patch-add-docker-columns.sql": true,
"patch-status-page.sql": true, "patch-status-page.sql": true,
"patch-proxy.sql": true, "patch-proxy.sql": true,
"patch-monitor-expiry-notification.sql": true, "patch-monitor-expiry-notification.sql": true,
"patch-status-page-footer-css.sql": true, "patch-status-page-footer-css.sql": true,
"patch-added-mqtt-monitor.sql": true, "patch-added-mqtt-monitor.sql": true,
"patch-add-clickable-status-page-link.sql": true,
"patch-add-sqlserver-monitor.sql": true,
"patch-add-other-auth.sql": { parents: [ "patch-monitor-basic-auth.sql" ] },
"patch-grpc-monitor.sql": true,
"patch-add-radius-monitor.sql": true,
"patch-monitor-add-resend-interval.sql": true,
"patch-maintenance-table2.sql": true,
}; };
/** /**
@ -183,13 +175,7 @@ class Database {
} else { } else {
log.info("db", "Database patch is needed"); log.info("db", "Database patch is needed");
try { this.backup(version);
this.backup(version);
} catch (e) {
log.error("db", e);
log.error("db", "Unable to create a backup before patching the database. Please make sure you have enough space and permission.");
process.exit(1);
}
// Try catch anything here, if gone wrong, restore the backup // Try catch anything here, if gone wrong, restore the backup
try { try {
@ -457,23 +443,6 @@ class Database {
this.backupWalPath = walPath + ".bak" + version; this.backupWalPath = walPath + ".bak" + version;
fs.copyFileSync(walPath, this.backupWalPath); fs.copyFileSync(walPath, this.backupWalPath);
} }
// Double confirm if all files actually backup
if (!fs.existsSync(this.backupPath)) {
throw new Error("Backup failed! " + this.backupPath);
}
if (fs.existsSync(shmPath)) {
if (!fs.existsSync(this.backupShmPath)) {
throw new Error("Backup failed! " + this.backupShmPath);
}
}
if (fs.existsSync(walPath)) {
if (!fs.existsSync(this.backupWalPath)) {
throw new Error("Backup failed! " + this.backupWalPath);
}
}
} }
} }

View File

@ -1,118 +0,0 @@
const axios = require("axios");
const { R } = require("redbean-node");
const version = require("../package.json").version;
const https = require("https");
class DockerHost {
/**
* Save a docker host
* @param {Object} dockerHost Docker host to save
* @param {?number} dockerHostID ID of the docker host to update
* @param {number} userID ID of the user who adds the docker host
* @returns {Promise<Bean>}
*/
static async save(dockerHost, dockerHostID, userID) {
let bean;
if (dockerHostID) {
bean = await R.findOne("docker_host", " id = ? AND user_id = ? ", [ dockerHostID, userID ]);
if (!bean) {
throw new Error("docker host not found");
}
} else {
bean = R.dispense("docker_host");
}
bean.user_id = userID;
bean.docker_daemon = dockerHost.dockerDaemon;
bean.docker_type = dockerHost.dockerType;
bean.name = dockerHost.name;
await R.store(bean);
return bean;
}
/**
* Delete a Docker host
* @param {number} dockerHostID ID of the Docker host to delete
* @param {number} userID ID of the user who created the Docker host
* @returns {Promise<void>}
*/
static async delete(dockerHostID, userID) {
let bean = await R.findOne("docker_host", " id = ? AND user_id = ? ", [ dockerHostID, userID ]);
if (!bean) {
throw new Error("docker host not found");
}
// Remove the deleted docker host from any monitors that reference it
await R.exec("UPDATE monitor SET docker_host = null WHERE docker_host = ?", [ dockerHostID ]);
await R.trash(bean);
}
/**
* Fetches the number of containers on the Docker host
* @param {Object} dockerHost Docker host to check for
* @returns {number} Total number of containers on the host
*/
static async testDockerHost(dockerHost) {
const options = {
url: "/containers/json?all=true",
headers: {
"Accept": "*/*",
"User-Agent": "Uptime-Kuma/" + version
},
httpsAgent: new https.Agent({
maxCachedSessions: 0, // Use Custom agent to disable session reuse (https://github.com/nodejs/node/issues/3940)
rejectUnauthorized: false,
}),
};
if (dockerHost.dockerType === "socket") {
options.socketPath = dockerHost.dockerDaemon;
} else if (dockerHost.dockerType === "tcp") {
options.baseURL = DockerHost.patchDockerURL(dockerHost.dockerDaemon);
}
let res = await axios.request(options);
if (Array.isArray(res.data)) {
if (res.data.length > 1) {
if ("ImageID" in res.data[0]) {
return res.data.length;
} else {
throw new Error("Invalid Docker response, is it Docker really a daemon?");
}
} else {
return res.data.length;
}
} else {
throw new Error("Invalid Docker response, is it Docker really a daemon?");
}
}
/**
* Since axios 0.27.X, it does not accept `tcp://` protocol.
* Change it to `http://` on the fly in order to fix it. (https://github.com/louislam/uptime-kuma/issues/2165)
*/
static patchDockerURL(url) {
if (typeof url === "string") {
// The g flag replaces every occurrence of tcp://
return url.replace(/tcp:\/\//g, "http://");
}
return url;
}
}
module.exports = {
DockerHost,
};
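A hedged usage sketch for the DockerHost helper above; the host object and daemon address are hypothetical, but the method signatures match the class as shown.
const { DockerHost } = require("./docker"); // hypothetical relative path
const host = {
    dockerType: "tcp",
    dockerDaemon: "tcp://127.0.0.1:2375", // hypothetical daemon address
};
// patchDockerURL() rewrites tcp:// to http:// so axios accepts the base URL
console.log(DockerHost.patchDockerURL(host.dockerDaemon)); // "http://127.0.0.1:2375"
DockerHost.testDockerHost(host)
    .then((count) => console.log(`Docker host reachable, ${count} containers found`))
    .catch((err) => console.error("Docker host test failed:", err.message));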

View File

@ -32,7 +32,6 @@ const initBackgroundJobs = function (args) {
return bree; return bree;
}; };
/** Stop all background jobs if running */
const stopBackgroundJobs = function () { const stopBackgroundJobs = function () {
if (bree) { if (bree) {
bree.stop(); bree.stop();

View File

@ -25,20 +25,15 @@ const DEFAULT_KEEP_PERIOD = 180;
parsedPeriod = DEFAULT_KEEP_PERIOD; parsedPeriod = DEFAULT_KEEP_PERIOD;
} }
if (parsedPeriod < 1) { log(`Clearing Data older than ${parsedPeriod} days...`);
log(`Data deletion has been disabled as period is less than 1. Period is ${parsedPeriod} days.`);
} else {
log(`Clearing Data older than ${parsedPeriod} days...`); try {
await R.exec(
try { "DELETE FROM heartbeat WHERE time < DATETIME('now', '-' || ? || ' days') ",
await R.exec( [ parsedPeriod ]
"DELETE FROM heartbeat WHERE time < DATETIME('now', '-' || ? || ' days') ", );
[ parsedPeriod ] } catch (e) {
); log(`Failed to clear old data: ${e.message}`);
} catch (e) {
log(`Failed to clear old data: ${e.message}`);
}
} }
exit(); exit();

View File

@ -1,19 +0,0 @@
const { BeanModel } = require("redbean-node/dist/bean-model");
class DockerHost extends BeanModel {
/**
* Returns an object that is ready to be parsed to JSON
* @returns {Object}
*/
toJSON() {
return {
id: this.id,
userID: this.user_id,
dockerDaemon: this.docker_daemon,
dockerType: this.docker_type,
name: this.name,
};
}
}
module.exports = DockerHost;

View File

@ -31,7 +31,7 @@ class Group extends BeanModel {
*/ */
async getMonitorList() { async getMonitorList() {
return R.convertToBeans("monitor", await R.getAll(` return R.convertToBeans("monitor", await R.getAll(`
SELECT monitor.*, monitor_group.send_url FROM monitor, monitor_group SELECT monitor.* FROM monitor, monitor_group
WHERE monitor.id = monitor_group.monitor_id WHERE monitor.id = monitor_group.monitor_id
AND group_id = ? AND group_id = ?
ORDER BY monitor_group.weight ORDER BY monitor_group.weight

View File

@ -1,3 +1,8 @@
const dayjs = require("dayjs");
const utc = require("dayjs/plugin/utc");
let timezone = require("dayjs/plugin/timezone");
dayjs.extend(utc);
dayjs.extend(timezone);
const { BeanModel } = require("redbean-node/dist/bean-model"); const { BeanModel } = require("redbean-node/dist/bean-model");
/** /**
@ -5,7 +10,6 @@ const { BeanModel } = require("redbean-node/dist/bean-model");
* 0 = DOWN * 0 = DOWN
* 1 = UP * 1 = UP
* 2 = PENDING * 2 = PENDING
* 3 = MAINTENANCE
*/ */
class Heartbeat extends BeanModel { class Heartbeat extends BeanModel {

View File

@ -1,240 +0,0 @@
const { BeanModel } = require("redbean-node/dist/bean-model");
const { parseTimeObject, parseTimeFromTimeObject, utcToLocal, localToUTC, log } = require("../../src/util");
const { timeObjectToUTC, timeObjectToLocal } = require("../util-server");
const { R } = require("redbean-node");
const dayjs = require("dayjs");
class Maintenance extends BeanModel {
/**
* Return an object that is ready to be parsed to JSON for the public
* Only show necessary data to the public
* @returns {Object}
*/
async toPublicJSON() {
let dateRange = [];
if (this.start_date) {
dateRange.push(utcToLocal(this.start_date));
if (this.end_date) {
dateRange.push(utcToLocal(this.end_date));
}
}
let timeRange = [];
let startTime = timeObjectToLocal(parseTimeObject(this.start_time));
timeRange.push(startTime);
let endTime = timeObjectToLocal(parseTimeObject(this.end_time));
timeRange.push(endTime);
let obj = {
id: this.id,
title: this.title,
description: this.description,
strategy: this.strategy,
intervalDay: this.interval_day,
active: !!this.active,
dateRange: dateRange,
timeRange: timeRange,
weekdays: (this.weekdays) ? JSON.parse(this.weekdays) : [],
daysOfMonth: (this.days_of_month) ? JSON.parse(this.days_of_month) : [],
timeslotList: [],
};
const timeslotList = await this.getTimeslotList();
for (let timeslot of timeslotList) {
obj.timeslotList.push(await timeslot.toPublicJSON());
}
if (!Array.isArray(obj.weekdays)) {
obj.weekdays = [];
}
if (!Array.isArray(obj.daysOfMonth)) {
obj.daysOfMonth = [];
}
// Maintenance Status
if (!obj.active) {
obj.status = "inactive";
} else if (obj.strategy === "manual") {
obj.status = "under-maintenance";
} else if (obj.timeslotList.length > 0) {
let currentTimestamp = dayjs().unix();
for (let timeslot of obj.timeslotList) {
if (dayjs.utc(timeslot.startDate).unix() <= currentTimestamp && dayjs.utc(timeslot.endDate).unix() >= currentTimestamp) {
log.debug("timeslot", "Timeslot ID: " + timeslot.id);
log.debug("timeslot", "currentTimestamp:" + currentTimestamp);
log.debug("timeslot", "timeslot.start_date:" + dayjs.utc(timeslot.startDate).unix());
log.debug("timeslot", "timeslot.end_date:" + dayjs.utc(timeslot.endDate).unix());
obj.status = "under-maintenance";
break;
}
}
if (!obj.status) {
obj.status = "scheduled";
}
} else if (obj.timeslotList.length === 0) {
obj.status = "ended";
} else {
obj.status = "unknown";
}
return obj;
}
/**
* Get only future or current timeslots
* @returns {Promise<[]>}
*/
async getTimeslotList() {
return R.convertToBeans("maintenance_timeslot", await R.getAll(`
SELECT maintenance_timeslot.*
FROM maintenance_timeslot, maintenance
WHERE maintenance_timeslot.maintenance_id = maintenance.id
AND maintenance.id = ?
AND ${Maintenance.getActiveAndFutureMaintenanceSQLCondition()}
`, [
this.id
]));
}
/**
* Return an object that is ready to be parsed to JSON
* @param {string} timezone If not specified, the timeRange will be in UTC
* @returns {Object}
*/
async toJSON(timezone = null) {
return this.toPublicJSON(timezone);
}
/**
* Get a list of weekdays that the maintenance is active for
* Monday=1, Tuesday=2 etc.
* @returns {number[]} Array of active weekdays
*/
getDayOfWeekList() {
log.debug("timeslot", "List: " + this.weekdays);
return JSON.parse(this.weekdays).sort(function (a, b) {
return a - b;
});
}
/**
* Get a list of days in month that maintenance is active for
* @returns {number[]} Array of active days in month
*/
getDayOfMonthList() {
return JSON.parse(this.days_of_month).sort(function (a, b) {
return a - b;
});
}
/**
* Get the start date and time for maintenance
* @returns {dayjs.Dayjs} Start date and time
*/
getStartDateTime() {
let startOfTheDay = dayjs.utc(this.start_date).format("HH:mm");
log.debug("timeslot", "startOfTheDay: " + startOfTheDay);
// Start Time
let startTimeSecond = dayjs.utc(this.start_time, "HH:mm").diff(dayjs.utc(startOfTheDay, "HH:mm"), "second");
log.debug("timeslot", "startTime: " + startTimeSecond);
// Bake StartDate + StartTime = Start DateTime
return dayjs.utc(this.start_date).add(startTimeSecond, "second");
}
/**
* Get the duration of maintenance in seconds
* @returns {number} Duration of maintenance
*/
getDuration() {
let duration = dayjs.utc(this.end_time, "HH:mm").diff(dayjs.utc(this.start_time, "HH:mm"), "second");
// Add 24hours if it is across day
if (duration < 0) {
duration += 24 * 3600;
}
return duration;
}
/**
* Convert data from socket to bean
* @param {Bean} bean Bean to fill in
* @param {Object} obj Data to fill bean with
* @returns {Bean} Filled bean
*/
static jsonToBean(bean, obj) {
if (obj.id) {
bean.id = obj.id;
}
// Apply timezone offset to timeRange, as it cannot apply automatically.
if (obj.timeRange[0]) {
timeObjectToUTC(obj.timeRange[0]);
if (obj.timeRange[1]) {
timeObjectToUTC(obj.timeRange[1]);
}
}
bean.title = obj.title;
bean.description = obj.description;
bean.strategy = obj.strategy;
bean.interval_day = obj.intervalDay;
bean.active = obj.active;
if (obj.dateRange[0]) {
bean.start_date = localToUTC(obj.dateRange[0]);
if (obj.dateRange[1]) {
bean.end_date = localToUTC(obj.dateRange[1]);
}
}
bean.start_time = parseTimeFromTimeObject(obj.timeRange[0]);
bean.end_time = parseTimeFromTimeObject(obj.timeRange[1]);
bean.weekdays = JSON.stringify(obj.weekdays);
bean.days_of_month = JSON.stringify(obj.daysOfMonth);
return bean;
}
/**
* SQL conditions for active maintenance
* @returns {string}
*/
static getActiveMaintenanceSQLCondition() {
return `
(
(maintenance_timeslot.start_date <= DATETIME('now')
AND maintenance_timeslot.end_date >= DATETIME('now')
AND maintenance.active = 1)
OR
(maintenance.strategy = 'manual' AND active = 1)
)
`;
}
/**
* SQL conditions for active and future maintenance
* @returns {string}
*/
static getActiveAndFutureMaintenanceSQLCondition() {
return `
(
((maintenance_timeslot.end_date >= DATETIME('now')
AND maintenance.active = 1)
OR
(maintenance.strategy = 'manual' AND active = 1))
)
`;
}
}
module.exports = Maintenance;
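To illustrate the data flow into jsonToBean() above, here is a minimal sketch, assuming a socket payload shaped like the fields the method reads; the require path and wrapper function are hypothetical.
const { R } = require("redbean-node");
const Maintenance = require("./model/maintenance"); // hypothetical require path
// "payload" mirrors the fields jsonToBean() reads:
// title, description, strategy, intervalDay, active, dateRange[], timeRange[], weekdays[], daysOfMonth[]
async function saveMaintenance(payload) {
    let bean = R.dispense("maintenance");
    bean = Maintenance.jsonToBean(bean, payload);
    return await R.store(bean);
}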

View File

@ -1,198 +0,0 @@
const { BeanModel } = require("redbean-node/dist/bean-model");
const { R } = require("redbean-node");
const dayjs = require("dayjs");
const { log, utcToLocal, SQL_DATETIME_FORMAT_WITHOUT_SECOND, localToUTC } = require("../../src/util");
const { UptimeKumaServer } = require("../uptime-kuma-server");
class MaintenanceTimeslot extends BeanModel {
/**
* Return an object that is ready to be parsed to JSON for the public
* Only show necessary data to the public
* @returns {Object}
*/
async toPublicJSON() {
const serverTimezoneOffset = UptimeKumaServer.getInstance().getTimezoneOffset();
const obj = {
id: this.id,
startDate: this.start_date,
endDate: this.end_date,
startDateServerTimezone: utcToLocal(this.start_date, SQL_DATETIME_FORMAT_WITHOUT_SECOND),
endDateServerTimezone: utcToLocal(this.end_date, SQL_DATETIME_FORMAT_WITHOUT_SECOND),
serverTimezoneOffset,
};
return obj;
}
/**
* Return an object that is ready to be parsed to JSON
* @returns {Object}
*/
async toJSON() {
return await this.toPublicJSON();
}
/**
* @param {Maintenance} maintenance
* @param {dayjs} minDate (For recurring type only) Generate a next timeslot from this date.
* @param {boolean} removeExist Remove existing timeslot before create
* @returns {Promise<MaintenanceTimeslot>}
*/
static async generateTimeslot(maintenance, minDate = null, removeExist = false) {
if (removeExist) {
await R.exec("DELETE FROM maintenance_timeslot WHERE maintenance_id = ? ", [
maintenance.id
]);
}
if (maintenance.strategy === "manual") {
log.debug("maintenance", "No need to generate timeslot for manual type");
} else if (maintenance.strategy === "single") {
let bean = R.dispense("maintenance_timeslot");
bean.maintenance_id = maintenance.id;
bean.start_date = maintenance.start_date;
bean.end_date = maintenance.end_date;
bean.generated_next = true;
return await R.store(bean);
} else if (maintenance.strategy === "recurring-interval") {
// Prevent an infinite loop in case interval_day is not set
if (!maintenance.interval_day || maintenance.interval_day <= 0) {
maintenance.interval_day = 1;
}
return await this.handleRecurringType(maintenance, minDate, (startDateTime) => {
return startDateTime.add(maintenance.interval_day, "day");
}, () => {
return true;
});
} else if (maintenance.strategy === "recurring-weekday") {
let dayOfWeekList = maintenance.getDayOfWeekList();
log.debug("timeslot", dayOfWeekList);
if (dayOfWeekList.length <= 0) {
log.debug("timeslot", "No weekdays selected?");
return null;
}
const isValid = (startDateTime) => {
log.debug("timeslot", "nextDateTime: " + startDateTime);
let day = startDateTime.local().day();
log.debug("timeslot", "nextDateTime.day(): " + day);
return dayOfWeekList.includes(day);
};
return await this.handleRecurringType(maintenance, minDate, (startDateTime) => {
while (true) {
startDateTime = startDateTime.add(1, "day");
if (isValid(startDateTime)) {
return startDateTime;
}
}
}, isValid);
} else if (maintenance.strategy === "recurring-day-of-month") {
let dayOfMonthList = maintenance.getDayOfMonthList();
if (dayOfMonthList.length <= 0) {
log.debug("timeslot", "No day selected?");
return null;
}
const isValid = (startDateTime) => {
let day = parseInt(startDateTime.local().format("D"));
log.debug("timeslot", "day: " + day);
// Check 1-31
if (dayOfMonthList.includes(day)) {
return startDateTime;
}
// Check "lastDay1","lastDay2"...
let daysInMonth = startDateTime.daysInMonth();
let lastDayList = [];
// Small first, e.g. 28 > 29 > 30 > 31
for (let i = 4; i >= 1; i--) {
if (dayOfMonthList.includes("lastDay" + i)) {
lastDayList.push(daysInMonth - i + 1);
}
}
log.debug("timeslot", lastDayList);
return lastDayList.includes(day);
};
return await this.handleRecurringType(maintenance, minDate, (startDateTime) => {
while (true) {
startDateTime = startDateTime.add(1, "day");
if (isValid(startDateTime)) {
return startDateTime;
}
}
}, isValid);
} else {
throw new Error("Unknown maintenance strategy");
}
}
/**
* Generate the next timeslot for all recurring types
* @param maintenance
* @param minDate
* @param {function} nextDayCallback The logic for getting the next possible day
* @param {function} isValidCallback Checks whether the day matches the current strategy
* @returns {Promise<null|MaintenanceTimeslot>}
*/
static async handleRecurringType(maintenance, minDate, nextDayCallback, isValidCallback) {
let bean = R.dispense("maintenance_timeslot");
let duration = maintenance.getDuration();
let startDateTime = maintenance.getStartDateTime();
let endDateTime;
// Keep generating from the first possible date, until it is ok
while (true) {
log.debug("timeslot", "startDateTime: " + startDateTime.format());
// Handling out of effective date range
if (startDateTime.diff(dayjs.utc(maintenance.end_date)) > 0) {
log.debug("timeslot", "Out of effective date range");
return null;
}
endDateTime = startDateTime.add(duration, "second");
// If endDateTime is out of effective date range, use the end datetime from effective date range
if (endDateTime.diff(dayjs.utc(maintenance.end_date)) > 0) {
endDateTime = dayjs.utc(maintenance.end_date);
}
// If minDate is set, the endDateTime must be bigger than it.
// And the endDateTime must be bigger than the current time
// Is valid under current recurring strategy
if (
(!minDate || endDateTime.diff(minDate) > 0) &&
endDateTime.diff(dayjs()) > 0 &&
isValidCallback(startDateTime)
) {
break;
}
startDateTime = nextDayCallback(startDateTime);
}
bean.maintenance_id = maintenance.id;
bean.start_date = localToUTC(startDateTime);
bean.end_date = localToUTC(endDateTime);
bean.generated_next = false;
return await R.store(bean);
}
}
module.exports = MaintenanceTimeslot;
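A short sketch of how generateTimeslot() above might be called after a maintenance is saved; the wrapper function and require path are hypothetical.
const MaintenanceTimeslot = require("./model/maintenance_timeslot"); // hypothetical require path
// Drop existing slots and generate the next one starting from now
async function refreshTimeslots(maintenanceBean) {
    return await MaintenanceTimeslot.generateTimeslot(maintenanceBean, null, true);
}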

View File

@ -1,11 +1,13 @@
const https = require("https"); const https = require("https");
const dayjs = require("dayjs"); const dayjs = require("dayjs");
const utc = require("dayjs/plugin/utc");
let timezone = require("dayjs/plugin/timezone");
dayjs.extend(utc);
dayjs.extend(timezone);
const axios = require("axios"); const axios = require("axios");
const { Prometheus } = require("../prometheus"); const { Prometheus } = require("../prometheus");
const { log, UP, DOWN, PENDING, MAINTENANCE, flipStatus, TimeLogger, MAX_INTERVAL_SECOND, MIN_INTERVAL_SECOND } = require("../../src/util"); const { log, UP, DOWN, PENDING, flipStatus, TimeLogger } = require("../../src/util");
const { tcping, ping, dnsResolve, checkCertificate, checkStatusCode, getTotalClientInRoom, setting, mssqlQuery, postgresQuery, mysqlQuery, mqttAsync, setSetting, httpNtlm, radius, grpcQuery, const { tcping, ping, dnsResolve, checkCertificate, checkStatusCode, getTotalClientInRoom, setting, mqttAsync } = require("../util-server");
redisPingAsync, mongodbPing,
} = require("../util-server");
const { R } = require("redbean-node"); const { R } = require("redbean-node");
const { BeanModel } = require("redbean-node/dist/bean-model"); const { BeanModel } = require("redbean-node/dist/bean-model");
const { Notification } = require("../notification"); const { Notification } = require("../notification");
@ -14,17 +16,12 @@ const { demoMode } = require("../config");
const version = require("../../package.json").version; const version = require("../../package.json").version;
const apicache = require("../modules/apicache"); const apicache = require("../modules/apicache");
const { UptimeKumaServer } = require("../uptime-kuma-server"); const { UptimeKumaServer } = require("../uptime-kuma-server");
const { CacheableDnsHttpAgent } = require("../cacheable-dns-http-agent");
const { DockerHost } = require("../docker");
const Maintenance = require("./maintenance");
const { UptimeCacheList } = require("../uptime-cache-list");
/** /**
* status: * status:
* 0 = DOWN * 0 = DOWN
* 1 = UP * 1 = UP
* 2 = PENDING * 2 = PENDING
* 3 = MAINTENANCE
*/ */
class Monitor extends BeanModel { class Monitor extends BeanModel {
@ -37,13 +34,7 @@ class Monitor extends BeanModel {
let obj = { let obj = {
id: this.id, id: this.id,
name: this.name, name: this.name,
sendUrl: this.sendUrl,
}; };
if (this.sendUrl) {
obj.url = this.url;
}
if (showTags) { if (showTags) {
obj.tags = await this.getTags(); obj.tags = await this.getTags();
} }
@ -81,7 +72,6 @@ class Monitor extends BeanModel {
type: this.type, type: this.type,
interval: this.interval, interval: this.interval,
retryInterval: this.retryInterval, retryInterval: this.retryInterval,
resendInterval: this.resendInterval,
keyword: this.keyword, keyword: this.keyword,
expiryNotification: this.isEnabledExpiryNotification(), expiryNotification: this.isEnabledExpiryNotification(),
ignoreTls: this.getIgnoreTls(), ignoreTls: this.getIgnoreTls(),
@ -91,23 +81,13 @@ class Monitor extends BeanModel {
dns_resolve_type: this.dns_resolve_type, dns_resolve_type: this.dns_resolve_type,
dns_resolve_server: this.dns_resolve_server, dns_resolve_server: this.dns_resolve_server,
dns_last_result: this.dns_last_result, dns_last_result: this.dns_last_result,
docker_container: this.docker_container,
docker_host: this.docker_host,
proxyId: this.proxy_id, proxyId: this.proxy_id,
notificationIDList, notificationIDList,
tags: tags, tags: tags,
maintenance: await Monitor.isUnderMaintenance(this.id), mqttUsername: this.mqttUsername,
mqttPassword: this.mqttPassword,
mqttTopic: this.mqttTopic, mqttTopic: this.mqttTopic,
mqttSuccessMessage: this.mqttSuccessMessage, mqttSuccessMessage: this.mqttSuccessMessage
databaseQuery: this.databaseQuery,
authMethod: this.authMethod,
grpcUrl: this.grpcUrl,
grpcProtobuf: this.grpcProtobuf,
grpcMethod: this.grpcMethod,
grpcServiceName: this.grpcServiceName,
grpcEnableTls: this.getGrpcEnableTls(),
radiusCalledStationId: this.radiusCalledStationId,
radiusCallingStationId: this.radiusCallingStationId,
}; };
if (includeSensitiveData) { if (includeSensitiveData) {
@ -115,23 +95,12 @@ class Monitor extends BeanModel {
...data, ...data,
headers: this.headers, headers: this.headers,
body: this.body, body: this.body,
grpcBody: this.grpcBody,
grpcMetadata: this.grpcMetadata,
basic_auth_user: this.basic_auth_user, basic_auth_user: this.basic_auth_user,
basic_auth_pass: this.basic_auth_pass, basic_auth_pass: this.basic_auth_pass,
pushToken: this.pushToken, pushToken: this.pushToken,
databaseConnectionString: this.databaseConnectionString,
radiusUsername: this.radiusUsername,
radiusPassword: this.radiusPassword,
radiusSecret: this.radiusSecret,
mqttUsername: this.mqttUsername,
mqttPassword: this.mqttPassword,
authWorkstation: this.authWorkstation,
authDomain: this.authDomain,
}; };
} }
data.includeSensitiveData = includeSensitiveData;
return data; return data;
} }
@ -176,14 +145,6 @@ class Monitor extends BeanModel {
return Boolean(this.upsideDown); return Boolean(this.upsideDown);
} }
/**
* Parse to boolean
* @returns {boolean}
*/
getGrpcEnableTls() {
return Boolean(this.grpcEnableTls);
}
/** /**
* Get accepted status codes * Get accepted status codes
* @returns {Object} * @returns {Object}
@ -231,9 +192,8 @@ class Monitor extends BeanModel {
let bean = R.dispense("heartbeat"); let bean = R.dispense("heartbeat");
bean.monitor_id = this.id; bean.monitor_id = this.id;
bean.time = R.isoDateTimeMillis(dayjs.utc()); bean.time = R.isoDateTime(dayjs.utc());
bean.status = DOWN; bean.status = DOWN;
bean.downCount = previousBeat?.downCount || 0;
if (this.isUpsideDown()) { if (this.isUpsideDown()) {
bean.status = flipStatus(bean.status); bean.status = flipStatus(bean.status);
@ -247,16 +207,13 @@ class Monitor extends BeanModel {
} }
try { try {
if (await Monitor.isUnderMaintenance(this.id)) { if (this.type === "http" || this.type === "keyword") {
bean.msg = "Monitor under maintenance";
bean.status = MAINTENANCE;
} else if (this.type === "http" || this.type === "keyword") {
// Do not do any queries/high loading things before the "bean.ping" // Do not do any queries/high loading things before the "bean.ping"
let startTime = dayjs().valueOf(); let startTime = dayjs().valueOf();
// HTTP basic auth // HTTP basic auth
let basicAuthHeader = {}; let basicAuthHeader = {};
if (this.auth_method === "basic") { if (this.basic_auth_user) {
basicAuthHeader = { basicAuthHeader = {
"Authorization": "Basic " + this.encodeBase64(this.basic_auth_user, this.basic_auth_pass), "Authorization": "Basic " + this.encodeBase64(this.basic_auth_user, this.basic_auth_pass),
}; };
@ -269,7 +226,6 @@ class Monitor extends BeanModel {
log.debug("monitor", `[${this.name}] Prepare Options for axios`); log.debug("monitor", `[${this.name}] Prepare Options for axios`);
// Axios Options
const options = { const options = {
url: this.url, url: this.url,
method: (this.method || "get").toLowerCase(), method: (this.method || "get").toLowerCase(),
@ -308,9 +264,7 @@ class Monitor extends BeanModel {
log.debug("monitor", `[${this.name}] Axios Options: ${JSON.stringify(options)}`); log.debug("monitor", `[${this.name}] Axios Options: ${JSON.stringify(options)}`);
log.debug("monitor", `[${this.name}] Axios Request`); log.debug("monitor", `[${this.name}] Axios Request`);
// Make Request let res = await axios.request(options);
let res = await this.makeAxiosRequest(options);
bean.msg = `${res.status} - ${res.statusText}`; bean.msg = `${res.status} - ${res.statusText}`;
bean.ping = dayjs().valueOf() - startTime; bean.ping = dayjs().valueOf() - startTime;
@ -358,11 +312,7 @@ class Monitor extends BeanModel {
bean.msg += ", keyword is found"; bean.msg += ", keyword is found";
bean.status = UP; bean.status = UP;
} else { } else {
data = data.replace(/<[^>]*>?|[\n\r]|\s+/gm, " "); throw new Error(bean.msg + ", but keyword is not found");
if (data.length > 50) {
data = data.substring(0, 47) + "...";
}
throw new Error(bean.msg + ", but keyword is not in [" + data + "]");
} }
} }
@ -417,33 +367,22 @@ class Monitor extends BeanModel {
bean.msg = dnsMessage; bean.msg = dnsMessage;
bean.status = UP; bean.status = UP;
} else if (this.type === "push") { // Type: Push } else if (this.type === "push") { // Type: Push
log.debug("monitor", `[${this.name}] Checking monitor at ${dayjs().format("YYYY-MM-DD HH:mm:ss.SSS")}`); const time = R.isoDateTime(dayjs.utc().subtract(this.interval, "second"));
const bufferTime = 1000; // 1s buffer to accommodate clock differences
if (previousBeat) { let heartbeatCount = await R.count("heartbeat", " monitor_id = ? AND time > ? ", [
const msSinceLastBeat = dayjs.utc().valueOf() - dayjs.utc(previousBeat.time).valueOf(); this.id,
time
]);
log.debug("monitor", `[${this.name}] msSinceLastBeat = ${msSinceLastBeat}`); log.debug("monitor", "heartbeatCount" + heartbeatCount + " " + time);
// If the previous beat was down or pending we use the regular if (heartbeatCount <= 0) {
// beatInterval/retryInterval in the setTimeout further below
if (previousBeat.status !== (this.isUpsideDown() ? DOWN : UP) || msSinceLastBeat > beatInterval * 1000 + bufferTime) {
throw new Error("No heartbeat in the time window");
} else {
let timeout = beatInterval * 1000 - msSinceLastBeat;
if (timeout < 0) {
timeout = bufferTime;
} else {
timeout += bufferTime;
}
// No need to insert successful heartbeat for push type, so end here
retries = 0;
log.debug("monitor", `[${this.name}] timeout = ${timeout}`);
this.heartbeatInterval = setTimeout(beat, timeout);
return;
}
} else {
throw new Error("No heartbeat in the time window"); throw new Error("No heartbeat in the time window");
} else {
// No need to insert successful heartbeat for push type, so end here
retries = 0;
this.heartbeatInterval = setTimeout(beat, beatInterval * 1000);
return;
} }
} else if (this.type === "steam") { } else if (this.type === "steam") {
@ -461,13 +400,10 @@ class Monitor extends BeanModel {
"Accept": "*/*", "Accept": "*/*",
"User-Agent": "Uptime-Kuma/" + version, "User-Agent": "Uptime-Kuma/" + version,
}, },
httpsAgent: CacheableDnsHttpAgent.getHttpsAgent({ httpsAgent: new https.Agent({
maxCachedSessions: 0, // Use Custom agent to disable session reuse (https://github.com/nodejs/node/issues/3940) maxCachedSessions: 0, // Use Custom agent to disable session reuse (https://github.com/nodejs/node/issues/3940)
rejectUnauthorized: !this.getIgnoreTls(), rejectUnauthorized: !this.getIgnoreTls(),
}), }),
httpAgent: CacheableDnsHttpAgent.getHttpAgent({
maxCachedSessions: 0,
}),
maxRedirects: this.maxredirects, maxRedirects: this.maxredirects,
validateStatus: (status) => { validateStatus: (status) => {
return checkStatusCode(status, this.getAcceptedStatuscodes()); return checkStatusCode(status, this.getAcceptedStatuscodes());
@ -488,41 +424,6 @@ class Monitor extends BeanModel {
} else { } else {
throw new Error("Server not found on Steam"); throw new Error("Server not found on Steam");
} }
} else if (this.type === "docker") {
log.debug(`[${this.name}] Prepare Options for Axios`);
const dockerHost = await R.load("docker_host", this.docker_host);
const options = {
url: `/containers/${this.docker_container}/json`,
timeout: this.interval * 1000 * 0.8,
headers: {
"Accept": "*/*",
"User-Agent": "Uptime-Kuma/" + version,
},
httpsAgent: CacheableDnsHttpAgent.getHttpsAgent({
maxCachedSessions: 0, // Use Custom agent to disable session reuse (https://github.com/nodejs/node/issues/3940)
rejectUnauthorized: !this.getIgnoreTls(),
}),
httpAgent: CacheableDnsHttpAgent.getHttpAgent({
maxCachedSessions: 0,
}),
};
if (dockerHost._dockerType === "socket") {
options.socketPath = dockerHost._dockerDaemon;
} else if (dockerHost._dockerType === "tcp") {
options.baseURL = DockerHost.patchDockerURL(dockerHost._dockerDaemon);
}
log.debug(`[${this.name}] Axios Request`);
let res = await axios.request(options);
if (res.data.State.Running) {
bean.status = UP;
bean.msg = res.data.State.Status;
} else {
throw Error("Container State is " + res.data.State.Status);
}
} else if (this.type === "mqtt") { } else if (this.type === "mqtt") {
bean.msg = await mqttAsync(this.hostname, this.mqttTopic, this.mqttSuccessMessage, { bean.msg = await mqttAsync(this.hostname, this.mqttTopic, this.mqttSuccessMessage, {
port: this.port, port: this.port,
@ -531,112 +432,6 @@ class Monitor extends BeanModel {
interval: this.interval, interval: this.interval,
}); });
bean.status = UP; bean.status = UP;
} else if (this.type === "sqlserver") {
let startTime = dayjs().valueOf();
await mssqlQuery(this.databaseConnectionString, this.databaseQuery);
bean.msg = "";
bean.status = UP;
bean.ping = dayjs().valueOf() - startTime;
} else if (this.type === "grpc-keyword") {
let startTime = dayjs().valueOf();
const options = {
grpcUrl: this.grpcUrl,
grpcProtobufData: this.grpcProtobuf,
grpcServiceName: this.grpcServiceName,
grpcEnableTls: this.grpcEnableTls,
grpcMethod: this.grpcMethod,
grpcBody: this.grpcBody,
keyword: this.keyword
};
const response = await grpcQuery(options);
bean.ping = dayjs().valueOf() - startTime;
log.debug("monitor:", `gRPC response: ${JSON.stringify(response)}`);
let responseData = response.data;
if (responseData.length > 50) {
responseData = responseData.toString().substring(0, 47) + "...";
}
if (response.code !== 1) {
bean.status = DOWN;
bean.msg = `Error in send gRPC ${response.code} ${response.errorMessage}`;
} else {
if (response.data.toString().includes(this.keyword)) {
bean.status = UP;
bean.msg = `${responseData}, keyword [${this.keyword}] is found`;
} else {
log.debug("monitor:", `GRPC response [${response.data}] + ", but keyword [${this.keyword}] is not in [" + ${response.data} + "]"`);
bean.status = DOWN;
bean.msg = `, but keyword [${this.keyword}] is not in [" + ${responseData} + "]`;
}
}
} else if (this.type === "postgres") {
let startTime = dayjs().valueOf();
await postgresQuery(this.databaseConnectionString, this.databaseQuery);
bean.msg = "";
bean.status = UP;
bean.ping = dayjs().valueOf() - startTime;
} else if (this.type === "mysql") {
let startTime = dayjs().valueOf();
await mysqlQuery(this.databaseConnectionString, this.databaseQuery);
bean.msg = "";
bean.status = UP;
bean.ping = dayjs().valueOf() - startTime;
} else if (this.type === "mongodb") {
let startTime = dayjs().valueOf();
await mongodbPing(this.databaseConnectionString);
bean.msg = "";
bean.status = UP;
bean.ping = dayjs().valueOf() - startTime;
} else if (this.type === "radius") {
let startTime = dayjs().valueOf();
// Handle monitors that were created before the
// update and as such don't have a value for
// this.port.
let port;
if (this.port == null) {
port = 1812;
} else {
port = this.port;
}
try {
const resp = await radius(
this.hostname,
this.radiusUsername,
this.radiusPassword,
this.radiusCalledStationId,
this.radiusCallingStationId,
this.radiusSecret,
port
);
if (resp.code) {
bean.msg = resp.code;
}
bean.status = UP;
} catch (error) {
bean.status = DOWN;
if (error.response?.code) {
bean.msg = error.response.code;
} else {
bean.msg = error.message;
}
}
bean.ping = dayjs().valueOf() - startTime;
} else if (this.type === "redis") {
let startTime = dayjs().valueOf();
bean.msg = await redisPingAsync(this.databaseConnectionString);
bean.status = UP;
bean.ping = dayjs().valueOf() - startTime;
} else { } else {
bean.msg = "Unknown Monitor Type"; bean.msg = "Unknown Monitor Type";
bean.status = PENDING; bean.status = PENDING;
@ -675,53 +470,29 @@ class Monitor extends BeanModel {
if (isImportant) { if (isImportant) {
bean.important = true; bean.important = true;
if (Monitor.isImportantForNotification(isFirstBeat, previousBeat?.status, bean.status)) { log.debug("monitor", `[${this.name}] sendNotification`);
log.debug("monitor", `[${this.name}] sendNotification`); await Monitor.sendNotification(isFirstBeat, this, bean);
await Monitor.sendNotification(isFirstBeat, this, bean);
} else {
log.debug("monitor", `[${this.name}] will not sendNotification because it is (or was) under maintenance`);
}
// Reset down count
bean.downCount = 0;
// Clear Status Page Cache // Clear Status Page Cache
log.debug("monitor", `[${this.name}] apicache clear`); log.debug("monitor", `[${this.name}] apicache clear`);
apicache.clear(); apicache.clear();
UptimeKumaServer.getInstance().sendMaintenanceListByUserID(this.user_id);
} else { } else {
bean.important = false; bean.important = false;
if (bean.status === DOWN && this.resendInterval > 0) {
++bean.downCount;
if (bean.downCount >= this.resendInterval) {
// Send notification again, because we are still DOWN
log.debug("monitor", `[${this.name}] sendNotification again: Down Count: ${bean.downCount} | Resend Interval: ${this.resendInterval}`);
await Monitor.sendNotification(isFirstBeat, this, bean);
// Reset down count
bean.downCount = 0;
}
}
} }
if (bean.status === UP) { if (bean.status === UP) {
log.debug("monitor", `Monitor #${this.id} '${this.name}': Successful Response: ${bean.ping} ms | Interval: ${beatInterval} seconds | Type: ${this.type}`); log.info("monitor", `Monitor #${this.id} '${this.name}': Successful Response: ${bean.ping} ms | Interval: ${beatInterval} seconds | Type: ${this.type}`);
} else if (bean.status === PENDING) { } else if (bean.status === PENDING) {
if (this.retryInterval > 0) { if (this.retryInterval > 0) {
beatInterval = this.retryInterval; beatInterval = this.retryInterval;
} }
log.warn("monitor", `Monitor #${this.id} '${this.name}': Pending: ${bean.msg} | Max retries: ${this.maxretries} | Retry: ${retries} | Retry Interval: ${beatInterval} seconds | Type: ${this.type}`); log.warn("monitor", `Monitor #${this.id} '${this.name}': Pending: ${bean.msg} | Max retries: ${this.maxretries} | Retry: ${retries} | Retry Interval: ${beatInterval} seconds | Type: ${this.type}`);
} else if (bean.status === MAINTENANCE) {
log.warn("monitor", `Monitor #${this.id} '${this.name}': Under Maintenance | Type: ${this.type}`);
} else { } else {
log.warn("monitor", `Monitor #${this.id} '${this.name}': Failing: ${bean.msg} | Interval: ${beatInterval} seconds | Type: ${this.type} | Down Count: ${bean.downCount} | Resend Interval: ${this.resendInterval}`); log.warn("monitor", `Monitor #${this.id} '${this.name}': Failing: ${bean.msg} | Interval: ${beatInterval} seconds | Type: ${this.type}`);
} }
log.debug("monitor", `[${this.name}] Send to socket`); log.debug("monitor", `[${this.name}] Send to socket`);
UptimeCacheList.clearCache(this.id);
io.to(this.user_id).emit("heartbeat", bean.toJSON()); io.to(this.user_id).emit("heartbeat", bean.toJSON());
Monitor.sendStats(io, this.id, this.user_id); Monitor.sendStats(io, this.id, this.user_id);
@ -768,47 +539,6 @@ class Monitor extends BeanModel {
} }
} }
/**
* Make a request using axios
* @param {Object} options Options for Axios
* @param {boolean} finalCall Should this be the final call i.e.
* don't retry on failure
* @returns {Object} Axios response
*/
async makeAxiosRequest(options, finalCall = false) {
try {
let res;
if (this.auth_method === "ntlm") {
options.httpsAgent.keepAlive = true;
res = await httpNtlm(options, {
username: this.basic_auth_user,
password: this.basic_auth_pass,
domain: this.authDomain,
workstation: this.authWorkstation ? this.authWorkstation : undefined
});
} else {
res = await axios.request(options);
}
return res;
} catch (e) {
// Fix #2253
// Read more: https://stackoverflow.com/questions/1759956/curl-error-18-transfer-closed-with-outstanding-read-data-remaining
if (!finalCall && typeof e.message === "string" && e.message.includes("maxContentLength size of -1 exceeded")) {
log.debug("monitor", "makeAxiosRequest with gzip");
options.headers["Accept-Encoding"] = "gzip, deflate";
return this.makeAxiosRequest(options, true);
} else {
if (typeof e.message === "string" && e.message.includes("maxContentLength size of -1 exceeded")) {
e.message = "response timeout: incomplete response within a interval";
}
throw e;
}
}
}
/** Stop monitor */ /** Stop monitor */
stop() { stop() {
clearTimeout(this.heartbeatInterval); clearTimeout(this.heartbeatInterval);
@ -947,15 +677,7 @@ class Monitor extends BeanModel {
* @param {number} duration Hours * @param {number} duration Hours
* @param {number} monitorID ID of monitor to calculate * @param {number} monitorID ID of monitor to calculate
*/ */
static async calcUptime(duration, monitorID, forceNoCache = false) { static async calcUptime(duration, monitorID) {
if (!forceNoCache) {
let cachedUptime = UptimeCacheList.getUptime(monitorID, duration);
if (cachedUptime != null) {
return cachedUptime;
}
}
const timeLogger = new TimeLogger(); const timeLogger = new TimeLogger();
const startTime = R.isoDateTime(dayjs.utc().subtract(duration, "hour")); const startTime = R.isoDateTime(dayjs.utc().subtract(duration, "hour"));
@ -976,7 +698,7 @@ class Monitor extends BeanModel {
-- SUM all uptime duration, also trim off the beat out of time window -- SUM all uptime duration, also trim off the beat out of time window
SUM( SUM(
CASE CASE
WHEN (status = 1 OR status = 3) WHEN (status = 1)
THEN THEN
CASE CASE
WHEN (JULIANDAY(\`time\`) - JULIANDAY(?)) * 86400 < duration WHEN (JULIANDAY(\`time\`) - JULIANDAY(?)) * 86400 < duration
@ -1014,9 +736,6 @@ class Monitor extends BeanModel {
} }
} }
// Cache
UptimeCacheList.addUptime(monitorID, duration, uptime);
return uptime; return uptime;
} }
@ -1050,49 +769,11 @@ class Monitor extends BeanModel {
// DOWN -> PENDING = this case not exists // DOWN -> PENDING = this case not exists
// DOWN -> DOWN = not important // DOWN -> DOWN = not important
// * DOWN -> UP = important // * DOWN -> UP = important
// MAINTENANCE -> MAINTENANCE = not important let isImportant = isFirstBeat ||
// * MAINTENANCE -> UP = important
// * MAINTENANCE -> DOWN = important
// * DOWN -> MAINTENANCE = important
// * UP -> MAINTENANCE = important
return isFirstBeat ||
(previousBeatStatus === DOWN && currentBeatStatus === MAINTENANCE) ||
(previousBeatStatus === UP && currentBeatStatus === MAINTENANCE) ||
(previousBeatStatus === MAINTENANCE && currentBeatStatus === DOWN) ||
(previousBeatStatus === MAINTENANCE && currentBeatStatus === UP) ||
(previousBeatStatus === UP && currentBeatStatus === DOWN) ||
(previousBeatStatus === DOWN && currentBeatStatus === UP) ||
(previousBeatStatus === PENDING && currentBeatStatus === DOWN);
}
/**
* Is this beat important for notifications?
* @param {boolean} isFirstBeat Is this the first beat of this monitor?
* @param {const} previousBeatStatus Status of the previous beat
* @param {const} currentBeatStatus Status of the current beat
* @returns {boolean} True if is an important beat else false
*/
static isImportantForNotification(isFirstBeat, previousBeatStatus, currentBeatStatus) {
// * ? -> ANY STATUS = important [isFirstBeat]
// UP -> PENDING = not important
// * UP -> DOWN = important
// UP -> UP = not important
// PENDING -> PENDING = not important
// * PENDING -> DOWN = important
// PENDING -> UP = not important
// DOWN -> PENDING = this case not exists
// DOWN -> DOWN = not important
// * DOWN -> UP = important
// MAINTENANCE -> MAINTENANCE = not important
// MAINTENANCE -> UP = not important
// * MAINTENANCE -> DOWN = important
// DOWN -> MAINTENANCE = not important
// UP -> MAINTENANCE = not important
return isFirstBeat ||
(previousBeatStatus === MAINTENANCE && currentBeatStatus === DOWN) ||
(previousBeatStatus === UP && currentBeatStatus === DOWN) || (previousBeatStatus === UP && currentBeatStatus === DOWN) ||
(previousBeatStatus === DOWN && currentBeatStatus === UP) || (previousBeatStatus === DOWN && currentBeatStatus === UP) ||
(previousBeatStatus === PENDING && currentBeatStatus === DOWN); (previousBeatStatus === PENDING && currentBeatStatus === DOWN);
return isImportant;
} }
/** /**
@ -1116,13 +797,7 @@ class Monitor extends BeanModel {
for (let notification of notificationList) { for (let notification of notificationList) {
try { try {
// Prevent if the msg is undefined, notifications such as Discord cannot send out. await Notification.send(JSON.parse(notification.config), msg, await monitor.toJSON(false), bean.toJSON());
const heartbeatJSON = bean.toJSON();
if (!heartbeatJSON["msg"]) {
heartbeatJSON["msg"] = "N/A";
}
await Notification.send(JSON.parse(notification.config), msg, await monitor.toJSON(false), heartbeatJSON);
} catch (e) { } catch (e) {
log.error("monitor", "Cannot send notification to " + notification.name); log.error("monitor", "Cannot send notification to " + notification.name);
log.error("monitor", e); log.error("monitor", e);
@ -1151,19 +826,10 @@ class Monitor extends BeanModel {
if (tlsInfoObject && tlsInfoObject.certInfo && tlsInfoObject.certInfo.daysRemaining) { if (tlsInfoObject && tlsInfoObject.certInfo && tlsInfoObject.certInfo.daysRemaining) {
const notificationList = await Monitor.getNotificationList(this); const notificationList = await Monitor.getNotificationList(this);
let notifyDays = await setting("tlsExpiryNotifyDays"); log.debug("monitor", "call sendCertNotificationByTargetDays");
if (notifyDays == null || !Array.isArray(notifyDays)) { await this.sendCertNotificationByTargetDays(tlsInfoObject.certInfo.daysRemaining, 21, notificationList);
// Reset Default await this.sendCertNotificationByTargetDays(tlsInfoObject.certInfo.daysRemaining, 14, notificationList);
setSetting("tlsExpiryNotifyDays", [ 7, 14, 21 ], "general"); await this.sendCertNotificationByTargetDays(tlsInfoObject.certInfo.daysRemaining, 7, notificationList);
notifyDays = [ 7, 14, 21 ];
}
if (notifyDays != null && Array.isArray(notifyDays)) {
for (const day of notifyDays) {
log.debug("monitor", "call sendCertNotificationByTargetDays", day);
await this.sendCertNotificationByTargetDays(tlsInfoObject.certInfo.daysRemaining, day, notificationList);
}
}
} }
} }
@ -1235,36 +901,6 @@ class Monitor extends BeanModel {
monitorID monitorID
]); ]);
} }
/**
* Check if monitor is under maintenance
* @param {number} monitorID ID of monitor to check
* @returns {Promise<boolean>}
*/
static async isUnderMaintenance(monitorID) {
let activeCondition = Maintenance.getActiveMaintenanceSQLCondition();
const maintenance = await R.getRow(`
SELECT COUNT(*) AS count
FROM monitor_maintenance mm
JOIN maintenance
ON mm.maintenance_id = maintenance.id
AND mm.monitor_id = ?
LEFT JOIN maintenance_timeslot
ON maintenance_timeslot.maintenance_id = maintenance.id
WHERE ${activeCondition}
LIMIT 1`, [ monitorID ]);
return maintenance.count !== 0;
}
/** Make sure monitor interval is between bounds */
validate() {
if (this.interval > MAX_INTERVAL_SECOND) {
throw new Error(`Interval cannot be more than ${MAX_INTERVAL_SECOND} seconds`);
}
if (this.interval < MIN_INTERVAL_SECOND) {
throw new Error(`Interval cannot be less than ${MIN_INTERVAL_SECOND} seconds`);
}
}
} }
module.exports = Monitor; module.exports = Monitor;
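A hedged sketch exercising two helpers visible in the monitor diff above, calcUptime() and isUnderMaintenance(); the summarize wrapper and require path are hypothetical.
const Monitor = require("./model/monitor"); // hypothetical require path
async function summarize(monitorID) {
    const uptime24h = await Monitor.calcUptime(24, monitorID); // served from UptimeCacheList when warm
    const underMaintenance = await Monitor.isUnderMaintenance(monitorID);
    return { uptime24h, underMaintenance };
}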

View File

@ -1,120 +1,10 @@
const { BeanModel } = require("redbean-node/dist/bean-model"); const { BeanModel } = require("redbean-node/dist/bean-model");
const { R } = require("redbean-node"); const { R } = require("redbean-node");
const cheerio = require("cheerio");
const { UptimeKumaServer } = require("../uptime-kuma-server");
const jsesc = require("jsesc");
const Maintenance = require("./maintenance");
class StatusPage extends BeanModel { class StatusPage extends BeanModel {
/**
* Like this: { "test-uptime.kuma.pet": "default" }
* @type {{}}
*/
static domainMappingList = { }; static domainMappingList = { };
/**
*
* @param {Response} response
* @param {string} indexHTML
* @param {string} slug
*/
static async handleStatusPageResponse(response, indexHTML, slug) {
let statusPage = await R.findOne("status_page", " slug = ? ", [
slug
]);
if (statusPage) {
response.send(await StatusPage.renderHTML(indexHTML, statusPage));
} else {
response.status(404).send(UptimeKumaServer.getInstance().indexHTML);
}
}
/**
* SSR for status pages
* @param {string} indexHTML
* @param {StatusPage} statusPage
*/
static async renderHTML(indexHTML, statusPage) {
const $ = cheerio.load(indexHTML);
const description155 = statusPage.description?.substring(0, 155) ?? "";
$("title").text(statusPage.title);
$("meta[name=description]").attr("content", description155);
if (statusPage.icon) {
$("link[rel=icon]")
.attr("href", statusPage.icon)
.removeAttr("type");
$("link[rel=apple-touch-icon]").remove();
}
const head = $("head");
// OG Meta Tags
head.append(`<meta property="og:title" content="${statusPage.title}" />`);
head.append(`<meta property="og:description" content="${description155}" />`);
// Preload data
// Add jsesc, fix https://github.com/louislam/uptime-kuma/issues/2186
const escapedJSONObject = jsesc(await StatusPage.getStatusPageData(statusPage), {
"isScriptContext": true
});
const script = $(`
<script id="preload-data" data-json="{}">
window.preloadData = ${escapedJSONObject};
</script>
`);
head.append(script);
// manifest.json
$("link[rel=manifest]").attr("href", `/api/status-page/${statusPage.slug}/manifest.json`);
return $.root().html();
}
/**
* Get all status page data in one call
* @param {StatusPage} statusPage
*/
static async getStatusPageData(statusPage) {
// Incident
let incident = await R.findOne("incident", " pin = 1 AND active = 1 AND status_page_id = ? ", [
statusPage.id,
]);
if (incident) {
incident = incident.toPublicJSON();
}
let maintenanceList = await StatusPage.getMaintenanceList(statusPage.id);
// Public Group List
const publicGroupList = [];
const showTags = !!statusPage.show_tags;
const list = await R.find("group", " public = 1 AND status_page_id = ? ORDER BY weight ", [
statusPage.id
]);
for (let groupBean of list) {
let monitorGroup = await groupBean.toPublicJSON(showTags);
publicGroupList.push(monitorGroup);
}
// Response
return {
config: await statusPage.toPublicJSON(),
incident,
publicGroupList,
maintenanceList,
};
}
/** /**
* Loads domain mapping from DB * Loads domain mapping from DB
* Return object like this: { "test-uptime.kuma.pet": "default" } * Return object like this: { "test-uptime.kuma.pet": "default" }
@ -270,38 +160,6 @@ class StatusPage extends BeanModel {
} }
} }
/**
* Get list of maintenances
* @param {number} statusPageId ID of status page to get maintenance for
* @returns {Object} Object representing maintenances sanitized for public
*/
static async getMaintenanceList(statusPageId) {
try {
const publicMaintenanceList = [];
let activeCondition = Maintenance.getActiveMaintenanceSQLCondition();
let maintenanceBeanList = R.convertToBeans("maintenance", await R.getAll(`
SELECT DISTINCT maintenance.*
FROM maintenance
JOIN maintenance_status_page
ON maintenance_status_page.maintenance_id = maintenance.id
AND maintenance_status_page.status_page_id = ?
LEFT JOIN maintenance_timeslot
ON maintenance_timeslot.maintenance_id = maintenance.id
WHERE ${activeCondition}
ORDER BY maintenance.end_date
`, [ statusPageId ]));
for (const bean of maintenanceBeanList) {
publicMaintenanceList.push(await bean.toPublicJSON());
}
return publicMaintenanceList;
} catch (error) {
return [];
}
}
} }
module.exports = StatusPage; module.exports = StatusPage;
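
A minimal sketch of how handleStatusPageResponse is typically wired into an Express route; app and server.indexHTML are assumed to exist here (the actual status-page router appears further down in this diff).

const StatusPage = require("./model/status_page");

// Assumed: `app` is an Express app and `server.indexHTML` holds the built index page.
app.get("/status/:slug", async (request, response) => {
    // Sends the SSR'd status page, or a 404 with the plain index.html if the slug is unknown.
    await StatusPage.handleStatusPageResponse(response, server.indexHTML, request.params.slug);
});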

View File

@ -1,20 +0,0 @@
import { PluginFunc, ConfigType } from 'dayjs'
declare const plugin: PluginFunc
export = plugin
declare module 'dayjs' {
interface Dayjs {
tz(timezone?: string, keepLocalTime?: boolean): Dayjs
offsetName(type?: 'short' | 'long'): string | undefined
}
interface DayjsTimezone {
(date: ConfigType, timezone?: string): Dayjs
(date: ConfigType, format: string, timezone?: string): Dayjs
guess(): string
setDefault(timezone?: string): void
}
const tz: DayjsTimezone
}

View File

@ -1,115 +0,0 @@
/**
* Copy from node_modules/dayjs/plugin/timezone.js
* Try to fix https://github.com/louislam/uptime-kuma/issues/2318
* Source: https://github.com/iamkun/dayjs/tree/dev/src/plugin/utc
* License: MIT
*/
!function (t, e) {
// eslint-disable-next-line no-undef
typeof exports == "object" && typeof module != "undefined" ? module.exports = e() : typeof define == "function" && define.amd ? define(e) : (t = typeof globalThis != "undefined" ? globalThis : t || self).dayjs_plugin_timezone = e();
}(this, (function () {
"use strict";
let t = {
year: 0,
month: 1,
day: 2,
hour: 3,
minute: 4,
second: 5
};
let e = {};
return function (n, i, o) {
let r;
let a = function (t, n, i) {
void 0 === i && (i = {});
let o = new Date(t);
let r = function (t, n) {
void 0 === n && (n = {});
let i = n.timeZoneName || "short";
let o = t + "|" + i;
let r = e[o];
return r || (r = new Intl.DateTimeFormat("en-US", {
hour12: !1,
timeZone: t,
year: "numeric",
month: "2-digit",
day: "2-digit",
hour: "2-digit",
minute: "2-digit",
second: "2-digit",
timeZoneName: i
}), e[o] = r), r;
}(n, i);
return r.formatToParts(o);
};
let u = function (e, n) {
let i = a(e, n);
let r = [];
let u = 0;
for (; u < i.length; u += 1) {
let f = i[u];
let s = f.type;
let m = f.value;
let c = t[s];
c >= 0 && (r[c] = parseInt(m, 10));
}
let d = r[3];
let l = d === 24 ? 0 : d;
let v = r[0] + "-" + r[1] + "-" + r[2] + " " + l + ":" + r[4] + ":" + r[5] + ":000";
let h = +e;
return (o.utc(v).valueOf() - (h -= h % 1e3)) / 6e4;
};
let f = i.prototype;
f.tz = function (t, e) {
void 0 === t && (t = r);
let n = this.utcOffset();
let i = this.toDate();
let a = i.toLocaleString("en-US", { timeZone: t }).replace("\u202f", " ");
let u = Math.round((i - new Date(a)) / 1e3 / 60);
let f = o(a).$set("millisecond", this.$ms).utcOffset(15 * -Math.round(i.getTimezoneOffset() / 15) - u, !0);
if (e) {
let s = f.utcOffset();
f = f.add(n - s, "minute");
}
return f.$x.$timezone = t, f;
}, f.offsetName = function (t) {
let e = this.$x.$timezone || o.tz.guess();
let n = a(this.valueOf(), e, { timeZoneName: t }).find((function (t) {
return t.type.toLowerCase() === "timezonename";
}));
return n && n.value;
};
let s = f.startOf;
f.startOf = function (t, e) {
if (!this.$x || !this.$x.$timezone) {
return s.call(this, t, e);
}
let n = o(this.format("YYYY-MM-DD HH:mm:ss:SSS"));
return s.call(n, t, e).tz(this.$x.$timezone, !0);
}, o.tz = function (t, e, n) {
let i = n && e;
let a = n || e || r;
let f = u(+o(), a);
if (typeof t != "string") {
return o(t).tz(a);
}
let s = function (t, e, n) {
let i = t - 60 * e * 1e3;
let o = u(i, n);
if (e === o) {
return [ i, e ];
}
let r = u(i -= 60 * (o - e) * 1e3, n);
return o === r ? [ i, o ] : [ t - 60 * Math.min(o, r) * 1e3, Math.max(o, r) ];
}(o.utc(t, i).valueOf(), f, a);
let m = s[0];
let c = s[1];
let d = o(m).utcOffset(c);
return d.$x.$timezone = a, d;
}, o.tz.guess = function () {
return Intl.DateTimeFormat().resolvedOptions().timeZone;
}, o.tz.setDefault = function (t) {
r = t;
};
};
}));
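
A minimal sketch of registering the bundled plugin; it relies on the UTC plugin being loaded first, as in the server bootstrap shown later in this diff. The timestamp value is illustrative.

const dayjs = require("dayjs");
dayjs.extend(require("dayjs/plugin/utc"));
dayjs.extend(require("./modules/dayjs/plugin/timezone"));

// Convert a UTC timestamp into the host's zone; guess() reads the Intl.DateTimeFormat options.
const local = dayjs.utc("2022-11-20 12:00:00").tz(dayjs.tz.guess());
console.log(local.format(), local.offsetName("long"));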

View File

@ -1,50 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const { setting } = require("../util-server");
const { getMonitorRelativeURL, UP, DOWN } = require("../../src/util");
class AlertNow extends NotificationProvider {
name = "AlertNow";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
let textMsg = "";
let status = "open";
let eventType = "ERROR";
let eventId = new Date().toISOString().slice(0, 10).replace(/-/g, "");
if (heartbeatJSON && heartbeatJSON.status === UP) {
textMsg = `[${heartbeatJSON.name}] ✅ Application is back online`;
status = "close";
eventType = "INFO";
eventId += `_${heartbeatJSON.name.replace(/\s/g, "")}`;
} else if (heartbeatJSON && heartbeatJSON.status === DOWN) {
textMsg = `[${heartbeatJSON.name}] 🔴 Application went down`;
}
textMsg += ` - ${msg}`;
const baseURL = await setting("primaryBaseURL");
if (baseURL && monitorJSON) {
textMsg += ` >> ${baseURL + getMonitorRelativeURL(monitorJSON.id)}`;
}
const data = {
"summary": textMsg,
"status": status,
"event_type": eventType,
"event_id": eventId,
};
await axios.post(notification.alertNowWebhookURL, data);
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = AlertNow;

View File

@ -12,7 +12,9 @@ const { default: axios } = require("axios");
// bark is an APN bridge that sends notifications to Apple devices. // bark is an APN bridge that sends notifications to Apple devices.
const barkNotificationGroup = "UptimeKuma";
const barkNotificationAvatar = "https://github.com/louislam/uptime-kuma/raw/master/public/icon.png"; const barkNotificationAvatar = "https://github.com/louislam/uptime-kuma/raw/master/public/icon.png";
const barkNotificationSound = "telegraph";
const successMessage = "Successes!"; const successMessage = "Successes!";
class Bark extends NotificationProvider { class Bark extends NotificationProvider {
@ -28,17 +30,17 @@ class Bark extends NotificationProvider {
if (msg != null && heartbeatJSON != null && heartbeatJSON["status"] === UP) { if (msg != null && heartbeatJSON != null && heartbeatJSON["status"] === UP) {
let title = "UptimeKuma Monitor Up"; let title = "UptimeKuma Monitor Up";
return await this.postNotification(notification, title, msg, barkEndpoint); return await this.postNotification(title, msg, barkEndpoint);
} }
if (msg != null && heartbeatJSON != null && heartbeatJSON["status"] === DOWN) { if (msg != null && heartbeatJSON != null && heartbeatJSON["status"] === DOWN) {
let title = "UptimeKuma Monitor Down"; let title = "UptimeKuma Monitor Down";
return await this.postNotification(notification, title, msg, barkEndpoint); return await this.postNotification(title, msg, barkEndpoint);
} }
if (msg != null) { if (msg != null) {
let title = "UptimeKuma Message"; let title = "UptimeKuma Message";
return await this.postNotification(notification, title, msg, barkEndpoint); return await this.postNotification(title, msg, barkEndpoint);
} }
} }
@ -48,23 +50,13 @@ class Bark extends NotificationProvider {
* @param {string} postUrl URL to append parameters to * @param {string} postUrl URL to append parameters to
* @returns {string} * @returns {string}
*/ */
appendAdditionalParameters(notification, postUrl) { appendAdditionalParameters(postUrl) {
// set icon to uptime kuma icon, 11kb should be fine
postUrl += "?icon=" + barkNotificationAvatar;
// grouping all our notifications // grouping all our notifications
if (notification.barkGroup != null) { postUrl += "?group=" + barkNotificationGroup;
postUrl += "&group=" + notification.barkGroup; // set icon to uptime kuma icon, 11kb should be fine
} else { postUrl += "&icon=" + barkNotificationAvatar;
// default name
postUrl += "&group=" + "UptimeKuma";
}
// picked a sound, this should follow system's mute status when arrival // picked a sound, this should follow system's mute status when arrival
if (notification.barkSound != null) { postUrl += "&sound=" + barkNotificationSound;
postUrl += "&sound=" + notification.barkSound;
} else {
// default sound
postUrl += "&sound=" + "telegraph";
}
return postUrl; return postUrl;
} }
@ -89,12 +81,12 @@ class Bark extends NotificationProvider {
* @param {string} endpoint Endpoint to send request to * @param {string} endpoint Endpoint to send request to
* @returns {string} * @returns {string}
*/ */
async postNotification(notification, title, subtitle, endpoint) { async postNotification(title, subtitle, endpoint) {
// url encode title and subtitle // url encode title and subtitle
title = encodeURIComponent(title); title = encodeURIComponent(title);
subtitle = encodeURIComponent(subtitle); subtitle = encodeURIComponent(subtitle);
let postUrl = endpoint + "/" + title + "/" + subtitle; let postUrl = endpoint + "/" + title + "/" + subtitle;
postUrl = this.appendAdditionalParameters(notification, postUrl); postUrl = this.appendAdditionalParameters(postUrl);
let result = await axios.get(postUrl); let result = await axios.get(postUrl);
this.checkResult(result); this.checkResult(result);
if (result.statusText != null) { if (result.statusText != null) {

View File

@ -64,7 +64,7 @@ class Discord extends NotificationProvider {
}, },
{ {
name: "Error", name: "Error",
value: heartbeatJSON["msg"] == null ? "N/A" : heartbeatJSON["msg"], value: heartbeatJSON["msg"],
}, },
], ],
}], }],
@ -91,7 +91,7 @@ class Discord extends NotificationProvider {
}, },
{ {
name: monitorJSON["type"] === "push" ? "Service Type" : "Service URL", name: monitorJSON["type"] === "push" ? "Service Type" : "Service URL",
value: monitorJSON["type"] === "push" ? "Heartbeat" : address, value: monitorJSON["type"] === "push" ? "Heartbeat" : address.startsWith("http") ? "[Visit Service](" + address + ")" : address,
}, },
{ {
name: "Time (UTC)", name: "Time (UTC)",

View File

@ -1,24 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
class FreeMobile extends NotificationProvider {
name = "FreeMobile";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
await axios.post(`https://smsapi.free-mobile.fr/sendmsg?msg=${encodeURIComponent(msg.replace("🔴", "⛔️"))}`, {
"user": notification.freemobileUser,
"pass": notification.freemobilePass,
});
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = FreeMobile;

View File

@ -1,35 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const { UP } = require("../../src/util");
class GoAlert extends NotificationProvider {
name = "GoAlert";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
let closeAction = "close";
let data = {
summary: msg,
};
if (heartbeatJSON != null && heartbeatJSON["status"] === UP) {
data["action"] = closeAction;
}
let headers = {
"Content-Type": "multipart/form-data",
};
let config = {
headers: headers
};
await axios.post(`${notification.goAlertBaseURL}/api/v2/generic/incoming?token=${notification.goAlertToken}`, data, config);
return okMsg;
} catch (error) {
let msg = (error.response.data) ? error.response.data : "Error without response";
throw new Error(msg);
}
}
}
module.exports = GoAlert;

View File

@ -1,38 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const defaultNotificationService = "notify";
class HomeAssistant extends NotificationProvider {
name = "HomeAssistant";
async send(notification, message, monitor = null, heartbeat = null) {
const notificationService = notification?.notificationService || defaultNotificationService;
try {
await axios.post(
`${notification.homeAssistantUrl}/api/services/notify/${notificationService}`,
{
title: "Uptime Kuma",
message,
...(notificationService !== "persistent_notification" && { data: {
name: monitor?.name,
status: heartbeat?.status,
} }),
},
{
headers: {
Authorization: `Bearer ${notification.longLivedAccessToken}`,
"Content-Type": "application/json",
},
}
);
return "Sent Successfully.";
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = HomeAssistant;
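
For reference, an illustrative settings object for the provider above; only the field names come from the code, the URL, token and service name are placeholders.

// Passed as the first argument to send().
const notification = {
    homeAssistantUrl: "http://homeassistant.local:8123",  // placeholder
    longLivedAccessToken: "LONG_LIVED_ACCESS_TOKEN",      // placeholder
    notificationService: "mobile_app_my_phone",           // falls back to "notify" when omitted
};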

View File

@ -1,31 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
class Kook extends NotificationProvider {
name = "Kook";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
let url = "https://www.kookapp.cn/api/v3/message/create";
let data = {
target_id: notification.kookGuildID,
content: msg,
};
let config = {
headers: {
"Authorization": "Bot " + notification.kookBotToken,
"Content-Type": "application/json",
},
};
try {
await axios.post(url, data, config);
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = Kook;

View File

@ -1,43 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const qs = require("qs");
const { DOWN, UP } = require("../../src/util");
class LineNotify extends NotificationProvider {
name = "LineNotify";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
let lineAPIUrl = "https://notify-api.line.me/api/notify";
let config = {
headers: {
"Content-Type": "application/x-www-form-urlencoded",
"Authorization": "Bearer " + notification.lineNotifyAccessToken
}
};
if (heartbeatJSON == null) {
let testMessage = {
"message": msg,
};
await axios.post(lineAPIUrl, qs.stringify(testMessage), config);
} else if (heartbeatJSON["status"] === DOWN) {
let downMessage = {
"message": "\n[🔴 Down]\n" + "Name: " + monitorJSON["name"] + " \n" + heartbeatJSON["msg"] + "\nTime (UTC): " + heartbeatJSON["time"]
};
await axios.post(lineAPIUrl, qs.stringify(downMessage), config);
} else if (heartbeatJSON["status"] === UP) {
let upMessage = {
"message": "\n[✅ Up]\n" + "Name: " + monitorJSON["name"] + " \n" + heartbeatJSON["msg"] + "\nTime (UTC): " + heartbeatJSON["time"]
};
await axios.post(lineAPIUrl, qs.stringify(upMessage), config);
}
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = LineNotify;

View File

@ -14,7 +14,7 @@ class LunaSea extends NotificationProvider {
if (heartbeatJSON == null) { if (heartbeatJSON == null) {
let testdata = { let testdata = {
"title": "Uptime Kuma Alert", "title": "Uptime Kuma Alert",
"body": msg, "body": "Testing Successful.",
}; };
await axios.post(lunaseadevice, testdata); await axios.post(lunaseadevice, testdata);
return okMsg; return okMsg;

View File

@ -1,38 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
class Ntfy extends NotificationProvider {
name = "ntfy";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
let headers = {};
if (notification.ntfyusername) {
headers = {
"Authorization": "Basic " + Buffer.from(notification.ntfyusername + ":" + notification.ntfypassword).toString("base64"),
};
}
let data = {
"topic": notification.ntfytopic,
"message": msg,
"priority": notification.ntfyPriority || 4,
"title": "Uptime-Kuma",
};
if (notification.ntfyIcon) {
data.icon = notification.ntfyIcon;
}
await axios.post(`${notification.ntfyserverurl}`, data, { headers: headers });
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = Ntfy;
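
An illustrative settings object for the ntfy provider above; field names follow the code, all values are made up.

const notification = {
    ntfyserverurl: "https://ntfy.sh",          // placeholder server URL
    ntfytopic: "uptime-kuma",                  // placeholder topic
    ntfyPriority: 5,                           // defaults to 4 when omitted
    ntfyusername: "alice",                     // optional; with ntfypassword enables Basic auth
    ntfypassword: "secret",
    ntfyIcon: "https://example.com/icon.png",  // optional
};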

View File

@ -10,7 +10,7 @@ class Octopush extends NotificationProvider {
try { try {
// Default - V2 // Default - V2
if (notification.octopushVersion === "2" || !notification.octopushVersion) { if (notification.octopushVersion === 2 || !notification.octopushVersion) {
let config = { let config = {
headers: { headers: {
"api-key": notification.octopushAPIKey, "api-key": notification.octopushAPIKey,
@ -31,7 +31,7 @@ class Octopush extends NotificationProvider {
"sender": notification.octopushSenderName "sender": notification.octopushSenderName
}; };
await axios.post("https://api.octopush.com/v1/public/sms-campaign/send", data, config); await axios.post("https://api.octopush.com/v1/public/sms-campaign/send", data, config);
} else if (notification.octopushVersion === "1") { } else if (notification.octopushVersion === 1) {
let data = { let data = {
"user_login": notification.octopushDMLogin, "user_login": notification.octopushDMLogin,
"api_key": notification.octopushDMAPIKey, "api_key": notification.octopushDMAPIKey,
@ -49,15 +49,7 @@ class Octopush extends NotificationProvider {
}, },
params: data params: data
}; };
await axios.post("https://www.octopush-dm.com/api/sms/json", {}, config);
// V1 API returns 200 even on error so we must check
// response data
let response = await axios.post("https://www.octopush-dm.com/api/sms/json", {}, config);
if ("error_code" in response.data) {
if (response.data.error_code !== "000") {
this.throwGeneralAxiosError(`Octopush error ${JSON.stringify(response.data)}`);
}
}
} else { } else {
throw new Error("Unknown Octopush version!"); throw new Error("Unknown Octopush version!");
} }

View File

@ -8,14 +8,6 @@ class PromoSMS extends NotificationProvider {
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) { async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully."; let okMsg = "Sent Successfully.";
if (notification.promosmsAllowLongSMS === undefined) {
notification.promosmsAllowLongSMS = false;
}
//TODO: Add option for enabling special characters. It will decrease message max length from 160 to 70 chars.
//Lets remove non ascii char
let cleanMsg = msg.replace(/[^\x00-\x7F]/g, "");
try { try {
let config = { let config = {
headers: { headers: {
@ -26,9 +18,8 @@ class PromoSMS extends NotificationProvider {
}; };
let data = { let data = {
"recipients": [ notification.promosmsPhoneNumber ], "recipients": [ notification.promosmsPhoneNumber ],
//Trim message to maximum length of 1 SMS or 4 if we allowed long messages //Lets remove non ascii char
"text": notification.promosmsAllowLongSMS ? cleanMsg.substring(0, 639) : cleanMsg.substring(0, 159), "text": msg.replace(/[^\x00-\x7F]/g, ""),
"long-sms": notification.promosmsAllowLongSMS,
"type": Number(notification.promosmsSMSType), "type": Number(notification.promosmsSMSType),
"sender": notification.promosmsSenderName "sender": notification.promosmsSenderName
}; };

View File

@ -19,26 +19,26 @@ class Pushbullet extends NotificationProvider {
} }
}; };
if (heartbeatJSON == null) { if (heartbeatJSON == null) {
let data = { let testdata = {
"type": "note", "type": "note",
"title": "Uptime Kuma Alert", "title": "Uptime Kuma Alert",
"body": msg, "body": "Testing Successful.",
}; };
await axios.post(pushbulletUrl, data, config); await axios.post(pushbulletUrl, testdata, config);
} else if (heartbeatJSON["status"] === DOWN) { } else if (heartbeatJSON["status"] === DOWN) {
let downData = { let downdata = {
"type": "note", "type": "note",
"title": "UptimeKuma Alert: " + monitorJSON["name"], "title": "UptimeKuma Alert: " + monitorJSON["name"],
"body": "[🔴 Down] " + heartbeatJSON["msg"] + "\nTime (UTC): " + heartbeatJSON["time"], "body": "[🔴 Down] " + heartbeatJSON["msg"] + "\nTime (UTC): " + heartbeatJSON["time"],
}; };
await axios.post(pushbulletUrl, downData, config); await axios.post(pushbulletUrl, downdata, config);
} else if (heartbeatJSON["status"] === UP) { } else if (heartbeatJSON["status"] === UP) {
let upData = { let updata = {
"type": "note", "type": "note",
"title": "UptimeKuma Alert: " + monitorJSON["name"], "title": "UptimeKuma Alert: " + monitorJSON["name"],
"body": "[✅ Up] " + heartbeatJSON["msg"] + "\nTime (UTC): " + heartbeatJSON["time"], "body": "[✅ Up] " + heartbeatJSON["msg"] + "\nTime (UTC): " + heartbeatJSON["time"],
}; };
await axios.post(pushbulletUrl, upData, config); await axios.post(pushbulletUrl, updata, config);
} }
return okMsg; return okMsg;
} catch (error) { } catch (error) {

View File

@ -10,7 +10,7 @@ class Pushover extends NotificationProvider {
let pushoverlink = "https://api.pushover.net/1/messages.json"; let pushoverlink = "https://api.pushover.net/1/messages.json";
let data = { let data = {
"message": "<b>Message</b>:" + msg, "message": "<b>Uptime Kuma Alert</b>\n\n<b>Message</b>:" + msg,
"user": notification.pushoveruserkey, "user": notification.pushoveruserkey,
"token": notification.pushoverapptoken, "token": notification.pushoverapptoken,
"sound": notification.pushoversounds, "sound": notification.pushoversounds,

View File

@ -1,42 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const { DOWN, UP } = require("../../src/util");
class ServerChan extends NotificationProvider {
name = "ServerChan";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
await axios.post(`https://sctapi.ftqq.com/${notification.serverChanSendKey}.send`, {
"title": this.checkStatus(heartbeatJSON, monitorJSON),
"desp": msg,
});
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
/**
* Get the formatted title for message
* @param {?Object} monitorJSON Monitor details (For Up/Down only)
* @param {?Object} heartbeatJSON Heartbeat details (For Up/Down only)
* @returns {string} Formatted title
*/
checkStatus(heartbeatJSON, monitorJSON) {
let title = "UptimeKuma Message";
if (heartbeatJSON != null && heartbeatJSON["status"] === UP) {
title = "UptimeKuma Monitor Up " + monitorJSON["name"];
}
if (heartbeatJSON != null && heartbeatJSON["status"] === DOWN) {
title = "UptimeKuma Monitor Down " + monitorJSON["name"];
}
return title;
}
}
module.exports = ServerChan;

View File

@ -1,7 +1,7 @@
const NotificationProvider = require("./notification-provider"); const NotificationProvider = require("./notification-provider");
const axios = require("axios"); const axios = require("axios");
const { setSettings, setting } = require("../util-server"); const { setSettings, setting } = require("../util-server");
const { getMonitorRelativeURL, UP } = require("../../src/util"); const { getMonitorRelativeURL } = require("../../src/util");
class Slack extends NotificationProvider { class Slack extends NotificationProvider {
@ -46,31 +46,24 @@ class Slack extends NotificationProvider {
"channel": notification.slackchannel, "channel": notification.slackchannel,
"username": notification.slackusername, "username": notification.slackusername,
"icon_emoji": notification.slackiconemo, "icon_emoji": notification.slackiconemo,
"attachments": [ "blocks": [{
"type": "header",
"text": {
"type": "plain_text",
"text": "Uptime Kuma Alert",
},
},
{
"type": "section",
"fields": [{
"type": "mrkdwn",
"text": "*Message*\n" + msg,
},
{ {
"color": (heartbeatJSON["status"] === UP) ? "#2eb886" : "#e01e5a", "type": "mrkdwn",
"blocks": [ "text": "*Time (UTC)*\n" + time,
{ }],
"type": "header", }],
"text": {
"type": "plain_text",
"text": "Uptime Kuma Alert",
},
},
{
"type": "section",
"fields": [{
"type": "mrkdwn",
"text": "*Message*\n" + msg,
},
{
"type": "mrkdwn",
"text": "*Time (UTC)*\n" + time,
}],
}
],
}
]
}; };
if (notification.slackbutton) { if (notification.slackbutton) {
@ -81,19 +74,17 @@ class Slack extends NotificationProvider {
// Button // Button
if (baseURL) { if (baseURL) {
data.attachments.forEach(element => { data.blocks.push({
element.blocks.push({ "type": "actions",
"type": "actions", "elements": [{
"elements": [{ "type": "button",
"type": "button", "text": {
"text": { "type": "plain_text",
"type": "plain_text", "text": "Visit Uptime Kuma",
"text": "Visit Uptime Kuma", },
}, "value": "Uptime-Kuma",
"value": "Uptime-Kuma", "url": baseURL + getMonitorRelativeURL(monitorJSON.id),
"url": baseURL + getMonitorRelativeURL(monitorJSON.id), }],
}],
});
}); });
} }

View File

@ -1,71 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
class SMSEagle extends NotificationProvider {
name = "SMSEagle";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
let config = {
headers: {
"Content-Type": "application/json",
}
};
let postData;
let sendMethod;
let recipientType;
let encoding = (notification.smseagleEncoding) ? "1" : "0";
let priority = (notification.smseaglePriority) ? notification.smseaglePriority : "0";
if (notification.smseagleRecipientType === "smseagle-contact") {
recipientType = "contactname";
sendMethod = "sms.send_tocontact";
}
if (notification.smseagleRecipientType === "smseagle-group") {
recipientType = "groupname";
sendMethod = "sms.send_togroup";
}
if (notification.smseagleRecipientType === "smseagle-to") {
recipientType = "to";
sendMethod = "sms.send_sms";
}
let params = {
access_token: notification.smseagleToken,
[recipientType]: notification.smseagleRecipient,
message: msg,
responsetype: "extended",
unicode: encoding,
highpriority: priority
};
postData = {
method: sendMethod,
params: params
};
let resp = await axios.post(notification.smseagleUrl + "/jsonrpc/sms", postData, config);
if ((JSON.stringify(resp.data)).indexOf("message_id") === -1) {
let error = "";
if (resp.data.result && resp.data.result.error_text) {
error = `SMSEagle API returned error: ${JSON.stringify(resp.data.result.error_text)}`;
} else {
error = "SMSEagle API returned an unexpected response";
}
throw new Error(error);
}
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = SMSEagle;

View File

@ -1,25 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
class SMSManager extends NotificationProvider {
name = "SMSManager";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
try {
let data = {
apikey: notification.smsmanagerApiKey,
endpoint: "https://http-api.smsmanager.cz/Send",
message: msg.replace(/[^\x00-\x7F]/g, ""),
to: notification.numbers,
messageType: notification.messageType,
};
await axios.get(`${data.endpoint}?apikey=${data.apikey}&message=${data.message}&number=${data.to}&gateway=${data.messageType}`);
return "SMS sent sucessfully.";
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = SMSManager;

View File

@ -1,113 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const { UP, DOWN, getMonitorRelativeURL } = require("../../src/util");
const { setting } = require("../util-server");
let successMessage = "Sent Successfully.";
class Splunk extends NotificationProvider {
name = "Splunk";
/**
* @inheritdoc
*/
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
try {
if (heartbeatJSON == null) {
const title = "Uptime Kuma Alert";
const monitor = {
type: "ping",
url: "Uptime Kuma Test Button",
};
return this.postNotification(notification, title, msg, monitor, "trigger");
}
if (heartbeatJSON.status === UP) {
const title = "Uptime Kuma Monitor ✅ Up";
return this.postNotification(notification, title, heartbeatJSON.msg, monitorJSON, "recovery");
}
if (heartbeatJSON.status === DOWN) {
const title = "Uptime Kuma Monitor 🔴 Down";
return this.postNotification(notification, title, heartbeatJSON.msg, monitorJSON, "trigger");
}
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
/**
* Check if result is successful, result code should be in range 2xx
* @param {Object} result Axios response object
* @throws {Error} The status code is not in range 2xx
*/
checkResult(result) {
if (result.status == null) {
throw new Error("Splunk notification failed with invalid response!");
}
if (result.status < 200 || result.status >= 300) {
throw new Error("Splunk notification failed with status code " + result.status);
}
}
/**
* Send the message
* @param {BeanModel} notification Notification details
* @param {string} title Message title
* @param {string} body Message
* @param {Object} monitorInfo Monitor details (For Up/Down only)
* @param {?string} eventAction Action event for PagerDuty (trigger, acknowledge, resolve)
* @returns {string}
*/
async postNotification(notification, title, body, monitorInfo, eventAction = "trigger") {
let monitorUrl;
if (monitorInfo.type === "port") {
monitorUrl = monitorInfo.hostname;
if (monitorInfo.port) {
monitorUrl += ":" + monitorInfo.port;
}
} else if (monitorInfo.hostname != null) {
monitorUrl = monitorInfo.hostname;
} else {
monitorUrl = monitorInfo.url;
}
if (eventAction === "recovery") {
if (notification.splunkAutoResolve === "0") {
return "No action required";
}
eventAction = notification.splunkAutoResolve;
} else {
eventAction = notification.splunkSeverity;
}
const options = {
method: "POST",
url: notification.splunkRestURL,
headers: { "Content-Type": "application/json" },
data: {
message_type: eventAction,
state_message: `[${title}] [${monitorUrl}] ${body}`,
entity_display_name: "Uptime Kuma Alert: " + monitorInfo.name,
routing_key: notification.pagerdutyIntegrationKey,
entity_id: "Uptime Kuma/" + monitorInfo.id,
}
};
const baseURL = await setting("primaryBaseURL");
if (baseURL && monitorInfo) {
options.client = "Uptime Kuma";
options.client_url = baseURL + getMonitorRelativeURL(monitorInfo.id);
}
let result = await axios.request(options);
this.checkResult(result);
if (result.statusText != null) {
return "Splunk notification succeed: " + result.statusText;
}
return successMessage;
}
}
module.exports = Splunk;

View File

@ -1,76 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const { DOWN } = require("../../src/util");
class Squadcast extends NotificationProvider {
name = "squadcast";
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
let config = {};
let data = {
message: msg,
description: "",
tags: {},
heartbeat: heartbeatJSON,
source: "uptime-kuma"
};
if (heartbeatJSON !== null) {
data.description = heartbeatJSON["msg"];
data.event_id = heartbeatJSON["monitorID"];
if (heartbeatJSON["status"] === DOWN) {
data.message = `${monitorJSON["name"]} is DOWN`;
data.status = "trigger";
} else {
data.message = `${monitorJSON["name"]} is UP`;
data.status = "resolve";
}
let address;
switch (monitorJSON["type"]) {
case "ping":
address = monitorJSON["hostname"];
break;
case "port":
case "dns":
case "steam":
address = monitorJSON["hostname"];
if (monitorJSON["port"]) {
address += ":" + monitorJSON["port"];
}
break;
default:
address = monitorJSON["url"];
break;
}
data.tags["AlertAddress"] = address;
monitorJSON["tags"].forEach(tag => {
data.tags[tag["name"]] = {
value: tag["value"]
};
if (tag["color"] !== null) {
data.tags[tag["name"]]["color"] = tag["color"];
}
});
}
await axios.post(notification.squadcastWebhookURL, data, config);
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = Squadcast;

View File

@ -63,7 +63,7 @@ class Teams extends NotificationProvider {
}); });
} }
if (monitorUrl && monitorUrl !== "https://") { if (monitorUrl) {
facts.push({ facts.push({
name: "URL", name: "URL",
value: monitorUrl, value: monitorUrl,
@ -127,17 +127,13 @@ class Teams extends NotificationProvider {
let url; let url;
switch (monitorJSON["type"]) { if (monitorJSON["type"] === "port") {
case "http": url = monitorJSON["hostname"];
case "keywork": if (monitorJSON["port"]) {
url = monitorJSON["url"]; url += ":" + monitorJSON["port"];
break; }
case "docker": } else {
url = monitorJSON["docker_host"]; url = monitorJSON["url"];
break;
default:
url = monitorJSON["hostname"];
break;
} }
const payload = this._notificationPayloadFactory({ const payload = this._notificationPayloadFactory({

View File

@ -16,29 +16,20 @@ class Webhook extends NotificationProvider {
msg, msg,
}; };
let finalData; let finalData;
let config = { let config = {};
headers: {}
};
if (notification.webhookContentType === "form-data") { if (notification.webhookContentType === "form-data") {
finalData = new FormData(); finalData = new FormData();
finalData.append("data", JSON.stringify(data)); finalData.append("data", JSON.stringify(data));
config.headers = finalData.getHeaders();
config = {
headers: finalData.getHeaders(),
};
} else { } else {
finalData = data; finalData = data;
} }
if (notification.webhookAdditionalHeaders) {
try {
config.headers = {
...config.headers,
...JSON.parse(notification.webhookAdditionalHeaders)
};
} catch (err) {
throw "Additional Headers is not a valid JSON";
}
}
await axios.post(notification.webhookURL, finalData, config); await axios.post(notification.webhookURL, finalData, config);
return okMsg; return okMsg;

View File

@ -1,116 +0,0 @@
const NotificationProvider = require("./notification-provider");
const axios = require("axios");
const { DOWN, UP } = require("../../src/util");
class ZohoCliq extends NotificationProvider {
name = "ZohoCliq";
/**
* Generate the message to send
* @param {const} status The status constant
* @param {string} monitorName Name of monitor
* @returns {string}
*/
_statusMessageFactory = (status, monitorName) => {
if (status === DOWN) {
return `🔴 Application [${monitorName}] went down\n`;
} else if (status === UP) {
return `✅ Application [${monitorName}] is back online\n`;
}
return "Notification\n";
};
/**
* Send the notification
* @param {string} webhookUrl URL to send the request to
* @param {Array} payload Payload generated by _notificationPayloadFactory
*/
_sendNotification = async (webhookUrl, payload) => {
await axios.post(webhookUrl, { text: payload.join("\n") });
};
/**
* Generate payload for notification
* @param {const} status The status of the monitor
* @param {string} monitorMessage Message to send
* @param {string} monitorName Name of monitor affected
* @param {string} monitorUrl URL of monitor affected
* @returns {Array}
*/
_notificationPayloadFactory = ({
status,
monitorMessage,
monitorName,
monitorUrl,
}) => {
const payload = [];
payload.push("### Uptime Kuma\n");
payload.push(this._statusMessageFactory(status, monitorName));
payload.push(`*Description:* ${monitorMessage}`);
if (monitorName) {
payload.push(`*Monitor:* ${monitorName}`);
}
if (monitorUrl && monitorUrl !== "https://") {
payload.push(`*URL:* [${monitorUrl}](${monitorUrl})`);
}
return payload;
};
/**
* Send a general notification
* @param {string} webhookUrl URL to send request to
* @param {string} msg Message to send
* @returns {Promise<void>}
*/
_handleGeneralNotification = (webhookUrl, msg) => {
const payload = this._notificationPayloadFactory({
monitorMessage: msg
});
return this._sendNotification(webhookUrl, payload);
};
async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
let okMsg = "Sent Successfully.";
try {
if (heartbeatJSON == null) {
await this._handleGeneralNotification(notification.webhookUrl, msg);
return okMsg;
}
let url;
switch (monitorJSON["type"]) {
case "http":
case "keywork":
url = monitorJSON["url"];
break;
case "docker":
url = monitorJSON["docker_host"];
break;
default:
url = monitorJSON["hostname"];
break;
}
const payload = this._notificationPayloadFactory({
monitorMessage: heartbeatJSON.msg,
monitorName: monitorJSON.name,
monitorUrl: url,
status: heartbeatJSON.status
});
await this._sendNotification(notification.webhookUrl, payload);
return okMsg;
} catch (error) {
this.throwGeneralAxiosError(error);
}
}
}
module.exports = ZohoCliq;

View File

@ -1,52 +1,39 @@
const { R } = require("redbean-node"); const { R } = require("redbean-node");
const { log } = require("../src/util");
const Alerta = require("./notification-providers/alerta");
const AlertNow = require("./notification-providers/alertnow");
const AliyunSms = require("./notification-providers/aliyun-sms");
const Apprise = require("./notification-providers/apprise"); const Apprise = require("./notification-providers/apprise");
const Bark = require("./notification-providers/bark");
const ClickSendSMS = require("./notification-providers/clicksendsms");
const DingDing = require("./notification-providers/dingding");
const Discord = require("./notification-providers/discord"); const Discord = require("./notification-providers/discord");
const Feishu = require("./notification-providers/feishu");
const FreeMobile = require("./notification-providers/freemobile");
const GoogleChat = require("./notification-providers/google-chat");
const Gorush = require("./notification-providers/gorush");
const Gotify = require("./notification-providers/gotify"); const Gotify = require("./notification-providers/gotify");
const HomeAssistant = require("./notification-providers/home-assistant");
const Kook = require("./notification-providers/kook");
const Line = require("./notification-providers/line"); const Line = require("./notification-providers/line");
const LineNotify = require("./notification-providers/linenotify");
const LunaSea = require("./notification-providers/lunasea"); const LunaSea = require("./notification-providers/lunasea");
const Matrix = require("./notification-providers/matrix");
const Mattermost = require("./notification-providers/mattermost"); const Mattermost = require("./notification-providers/mattermost");
const Ntfy = require("./notification-providers/ntfy"); const Matrix = require("./notification-providers/matrix");
const Octopush = require("./notification-providers/octopush"); const Octopush = require("./notification-providers/octopush");
const OneBot = require("./notification-providers/onebot");
const PagerDuty = require("./notification-providers/pagerduty");
const PromoSMS = require("./notification-providers/promosms"); const PromoSMS = require("./notification-providers/promosms");
const ClickSendSMS = require("./notification-providers/clicksendsms");
const Pushbullet = require("./notification-providers/pushbullet"); const Pushbullet = require("./notification-providers/pushbullet");
const PushDeer = require("./notification-providers/pushdeer");
const Pushover = require("./notification-providers/pushover"); const Pushover = require("./notification-providers/pushover");
const Pushy = require("./notification-providers/pushy"); const Pushy = require("./notification-providers/pushy");
const TechulusPush = require("./notification-providers/techulus-push");
const RocketChat = require("./notification-providers/rocket-chat"); const RocketChat = require("./notification-providers/rocket-chat");
const SerwerSMS = require("./notification-providers/serwersms");
const Signal = require("./notification-providers/signal"); const Signal = require("./notification-providers/signal");
const Slack = require("./notification-providers/slack"); const Slack = require("./notification-providers/slack");
const SMSEagle = require("./notification-providers/smseagle");
const SMTP = require("./notification-providers/smtp"); const SMTP = require("./notification-providers/smtp");
const Squadcast = require("./notification-providers/squadcast");
const Stackfield = require("./notification-providers/stackfield");
const Teams = require("./notification-providers/teams"); const Teams = require("./notification-providers/teams");
const TechulusPush = require("./notification-providers/techulus-push");
const Telegram = require("./notification-providers/telegram"); const Telegram = require("./notification-providers/telegram");
const Splunk = require("./notification-providers/splunk");
const Webhook = require("./notification-providers/webhook"); const Webhook = require("./notification-providers/webhook");
const Feishu = require("./notification-providers/feishu");
const AliyunSms = require("./notification-providers/aliyun-sms");
const DingDing = require("./notification-providers/dingding");
const Bark = require("./notification-providers/bark");
const { log } = require("../src/util");
const SerwerSMS = require("./notification-providers/serwersms");
const Stackfield = require("./notification-providers/stackfield");
const WeCom = require("./notification-providers/wecom"); const WeCom = require("./notification-providers/wecom");
const GoAlert = require("./notification-providers/goalert"); const GoogleChat = require("./notification-providers/google-chat");
const SMSManager = require("./notification-providers/smsmanager"); const PagerDuty = require("./notification-providers/pagerduty");
const ServerChan = require("./notification-providers/serverchan"); const Gorush = require("./notification-providers/gorush");
const ZohoCliq = require("./notification-providers/zoho-cliq"); const Alerta = require("./notification-providers/alerta");
const OneBot = require("./notification-providers/onebot");
const PushDeer = require("./notification-providers/pushdeer");
class Notification { class Notification {
@ -59,53 +46,40 @@ class Notification {
this.providerList = {}; this.providerList = {};
const list = [ const list = [
new Alerta(),
new AlertNow(),
new AliyunSms(),
new Apprise(), new Apprise(),
new Bark(), new AliyunSms(),
new ClickSendSMS(),
new DingDing(), new DingDing(),
new Discord(), new Discord(),
new Feishu(), new Teams(),
new FreeMobile(),
new GoogleChat(),
new Gorush(),
new Gotify(), new Gotify(),
new HomeAssistant(),
new Kook(),
new Line(), new Line(),
new LineNotify(),
new LunaSea(), new LunaSea(),
new Matrix(), new Feishu(),
new Mattermost(), new Mattermost(),
new Ntfy(), new Matrix(),
new Octopush(), new Octopush(),
new OneBot(),
new PagerDuty(),
new PromoSMS(), new PromoSMS(),
new ClickSendSMS(),
new Pushbullet(), new Pushbullet(),
new PushDeer(),
new Pushover(), new Pushover(),
new Pushy(), new Pushy(),
new RocketChat(),
new ServerChan(),
new SerwerSMS(),
new Signal(),
new SMSManager(),
new Slack(),
new SMSEagle(),
new SMTP(),
new Squadcast(),
new Stackfield(),
new Teams(),
new TechulusPush(), new TechulusPush(),
new RocketChat(),
new Signal(),
new Slack(),
new SMTP(),
new Telegram(), new Telegram(),
new Splunk(),
new Webhook(), new Webhook(),
new Bark(),
new SerwerSMS(),
new Stackfield(),
new WeCom(), new WeCom(),
new GoAlert(), new GoogleChat(),
new ZohoCliq() new PagerDuty(),
new Gorush(),
new Alerta(),
new OneBot(),
new PushDeer(),
]; ];
for (let item of list) { for (let item of list) {
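
All of the provider modules registered above share the same shape; a minimal sketch of one follows (the class name and the exampleWebhookURL settings field are invented for illustration).

const NotificationProvider = require("./notification-providers/notification-provider");
const axios = require("axios");

class Example extends NotificationProvider {
    name = "Example";

    async send(notification, msg, monitorJSON = null, heartbeatJSON = null) {
        try {
            // exampleWebhookURL is a made-up settings field for this sketch.
            await axios.post(notification.exampleWebhookURL, { text: msg });
            return "Sent Successfully.";
        } catch (error) {
            this.throwGeneralAxiosError(error);
        }
    }
}

module.exports = Example;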

199
server/ping-lite.js Normal file
View File

@ -0,0 +1,199 @@
// https://github.com/ben-bradley/ping-lite/blob/master/ping-lite.js
// Fixed on Windows
const net = require("net");
const spawn = require("child_process").spawn;
const events = require("events");
const fs = require("fs");
const util = require("./util-server");
module.exports = Ping;
/**
* Constructor for ping class
* @param {string} host Host to ping
* @param {object} [options] Options for the ping command
* @param {array|string} [options.args] - Arguments to pass to the ping command
*/
function Ping(host, options) {
if (!host) {
throw new Error("You must specify a host to ping!");
}
this._host = host;
this._options = options = (options || {});
events.EventEmitter.call(this);
const timeout = 10;
if (util.WIN) {
this._bin = "c:/windows/system32/ping.exe";
this._args = (options.args) ? options.args : [ "-n", "1", "-w", timeout * 1000, host ];
this._regmatch = /[><=]([0-9.]+?)ms/;
} else if (util.LIN) {
this._bin = "/bin/ping";
const defaultArgs = [ "-n", "-w", timeout, "-c", "1", host ];
if (net.isIPv6(host) || options.ipv6) {
defaultArgs.unshift("-6");
}
this._args = (options.args) ? options.args : defaultArgs;
this._regmatch = /=([0-9.]+?) ms/;
} else if (util.MAC) {
if (net.isIPv6(host) || options.ipv6) {
this._bin = "/sbin/ping6";
} else {
this._bin = "/sbin/ping";
}
this._args = (options.args) ? options.args : [ "-n", "-t", timeout, "-c", "1", host ];
this._regmatch = /=([0-9.]+?) ms/;
} else if (util.BSD) {
this._bin = "/sbin/ping";
const defaultArgs = [ "-n", "-t", timeout, "-c", "1", host ];
if (net.isIPv6(host) || options.ipv6) {
defaultArgs.unshift("-6");
}
this._args = (options.args) ? options.args : defaultArgs;
this._regmatch = /=([0-9.]+?) ms/;
} else {
throw new Error("Could not detect your ping binary.");
}
if (!fs.existsSync(this._bin)) {
throw new Error("Could not detect " + this._bin + " on your system");
}
this._i = 0;
return this;
}
Ping.prototype.__proto__ = events.EventEmitter.prototype;
/**
* Callback for send
* @callback pingCB
* @param {any} err Any error encountered
* @param {number} ms Ping time in ms
*/
/**
* Send a ping
* @param {pingCB} callback Callback to call with results
*/
Ping.prototype.send = function (callback) {
let self = this;
callback = callback || function (err, ms) {
if (err) {
return self.emit("error", err);
}
return self.emit("result", ms);
};
let _ended;
let _exited;
let _errored;
this._ping = spawn(this._bin, this._args); // spawn the binary
this._ping.on("error", function (err) { // handle binary errors
_errored = true;
callback(err);
});
this._ping.stdout.on("data", function (data) { // log stdout
if (util.WIN) {
data = convertOutput(data);
}
this._stdout = (this._stdout || "") + data;
});
this._ping.stdout.on("end", function () {
_ended = true;
if (_exited && !_errored) {
onEnd.call(self._ping);
}
});
this._ping.stderr.on("data", function (data) { // log stderr
if (util.WIN) {
data = convertOutput(data);
}
this._stderr = (this._stderr || "") + data;
});
this._ping.on("exit", function (code) { // handle complete
_exited = true;
if (_ended && !_errored) {
onEnd.call(self._ping);
}
});
/**
* @param {Function} callback
*
* Generated by Trelent
*/
function onEnd() {
let stdout = this.stdout._stdout;
let stderr = this.stderr._stderr;
let ms;
if (stderr) {
return callback(new Error(stderr));
}
if (!stdout) {
return callback(new Error("No stdout detected"));
}
ms = stdout.match(self._regmatch); // parse out the ##ms response
ms = (ms && ms[1]) ? Number(ms[1]) : ms;
callback(null, ms, stdout);
}
};
/**
* Ping every interval
* @param {pingCB} callback Callback to call with results
*/
Ping.prototype.start = function (callback) {
let self = this;
this._i = setInterval(function () {
self.send(callback);
}, (self._options.interval || 5000));
self.send(callback);
};
/** Stop sending pings */
Ping.prototype.stop = function () {
clearInterval(this._i);
};
/**
* Try to convert to UTF-8 for Windows, as the ping's output on Windows is not UTF-8 and could be in other languages
* Thank @pemassi
* https://github.com/louislam/uptime-kuma/issues/570#issuecomment-941984094
* @param {any} data
* @returns {string}
*/
function convertOutput(data) {
if (util.WIN) {
if (data) {
return util.convertToUTF8(data);
}
}
return data;
}
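
A minimal usage sketch of the Ping helper above; the host value is illustrative.

const Ping = require("./ping-lite");

const ping = new Ping("127.0.0.1");

ping.send((err, ms) => {
    if (err) {
        console.error("Ping failed:", err.message);
    } else {
        console.log(`Round trip took ${ms} ms`);
    }
});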

View File

@ -99,7 +99,6 @@ class Prometheus {
} }
} }
/** Remove monitor from prometheus */
remove() { remove() {
try { try {
monitorCertDaysRemaining.remove(this.monitorLabelValues); monitorCertDaysRemaining.remove(this.monitorLabelValues);

View File

@ -7,7 +7,7 @@ const { UptimeKumaServer } = require("./uptime-kuma-server");
class Proxy { class Proxy {
static SUPPORTED_PROXY_PROTOCOLS = [ "http", "https", "socks", "socks5", "socks5h", "socks4" ]; static SUPPORTED_PROXY_PROTOCOLS = [ "http", "https", "socks", "socks5", "socks4" ];
/** /**
* Saves and updates given proxy entity * Saves and updates given proxy entity
@ -126,7 +126,6 @@ class Proxy {
break; break;
case "socks": case "socks":
case "socks5": case "socks5":
case "socks5h":
case "socks4": case "socks4":
agent = new SocksProxyAgent({ agent = new SocksProxyAgent({
...httpAgentOptions, ...httpAgentOptions,

View File

@ -1,10 +1,10 @@
let express = require("express"); let express = require("express");
const { allowDevAllOrigin, allowAllOrigin, percentageToColor, filterAndJoin, send403 } = require("../util-server"); const { allowDevAllOrigin, allowAllOrigin, percentageToColor, filterAndJoin } = require("../util-server");
const { R } = require("redbean-node"); const { R } = require("redbean-node");
const apicache = require("../modules/apicache"); const apicache = require("../modules/apicache");
const Monitor = require("../model/monitor"); const Monitor = require("../model/monitor");
const dayjs = require("dayjs"); const dayjs = require("dayjs");
const { UP, MAINTENANCE, DOWN, flipStatus, log } = require("../../src/util"); const { UP, DOWN, flipStatus, log } = require("../../src/util");
const StatusPage = require("../model/status_page"); const StatusPage = require("../model/status_page");
const { UptimeKumaServer } = require("../uptime-kuma-server"); const { UptimeKumaServer } = require("../uptime-kuma-server");
const { makeBadge } = require("badge-maker"); const { makeBadge } = require("badge-maker");
@ -59,7 +59,7 @@ router.get("/api/push/:pushToken", async (request, response) => {
let duration = 0; let duration = 0;
let bean = R.dispense("heartbeat"); let bean = R.dispense("heartbeat");
bean.time = R.isoDateTimeMillis(dayjs.utc()); bean.time = R.isoDateTime(dayjs.utc());
if (previousHeartbeat) { if (previousHeartbeat) {
isFirstBeat = false; isFirstBeat = false;
@ -67,12 +67,6 @@ router.get("/api/push/:pushToken", async (request, response) => {
duration = dayjs(bean.time).diff(dayjs(previousHeartbeat.time), "second"); duration = dayjs(bean.time).diff(dayjs(previousHeartbeat.time), "second");
} }
if (await Monitor.isUnderMaintenance(monitor.id)) {
msg = "Monitor under maintenance";
status = MAINTENANCE;
}
log.debug("router", `/api/push/ called at ${dayjs().format("YYYY-MM-DD HH:mm:ss.SSS")}`);
log.debug("router", "PreviousStatus: " + previousStatus); log.debug("router", "PreviousStatus: " + previousStatus);
log.debug("router", "Current Status: " + status); log.debug("router", "Current Status: " + status);
@ -92,18 +86,120 @@ router.get("/api/push/:pushToken", async (request, response) => {
ok: true, ok: true,
}); });
if (Monitor.isImportantForNotification(isFirstBeat, previousStatus, status)) { if (bean.important) {
await Monitor.sendNotification(isFirstBeat, monitor, bean); await Monitor.sendNotification(isFirstBeat, monitor, bean);
} }
} catch (e) { } catch (e) {
response.status(404).json({ response.json({
ok: false, ok: false,
msg: e.message msg: e.message
}); });
} }
}); });
// Status page config, incident, monitor list
router.get("/api/status-page/:slug", cache("5 minutes"), async (request, response) => {
allowDevAllOrigin(response);
let slug = request.params.slug;
// Get Status Page
let statusPage = await R.findOne("status_page", " slug = ? ", [
slug
]);
if (!statusPage) {
response.statusCode = 404;
response.json({
msg: "Not Found"
});
return;
}
try {
// Incident
let incident = await R.findOne("incident", " pin = 1 AND active = 1 AND status_page_id = ? ", [
statusPage.id,
]);
if (incident) {
incident = incident.toPublicJSON();
}
// Public Group List
const publicGroupList = [];
const showTags = !!statusPage.show_tags;
const list = await R.find("group", " public = 1 AND status_page_id = ? ORDER BY weight ", [
statusPage.id
]);
for (let groupBean of list) {
let monitorGroup = await groupBean.toPublicJSON(showTags);
publicGroupList.push(monitorGroup);
}
// Response
response.json({
config: await statusPage.toPublicJSON(),
incident,
publicGroupList
});
} catch (error) {
send403(response, error.message);
}
});
// Status Page Polling Data
// Can fetch only if published
router.get("/api/status-page/heartbeat/:slug", cache("1 minutes"), async (request, response) => {
allowDevAllOrigin(response);
try {
let heartbeatList = {};
let uptimeList = {};
let slug = request.params.slug;
let statusPageID = await StatusPage.slugToID(slug);
let monitorIDList = await R.getCol(`
SELECT monitor_group.monitor_id FROM monitor_group, \`group\`
WHERE monitor_group.group_id = \`group\`.id
AND public = 1
AND \`group\`.status_page_id = ?
`, [
statusPageID
]);
for (let monitorID of monitorIDList) {
let list = await R.getAll(`
SELECT * FROM heartbeat
WHERE monitor_id = ?
ORDER BY time DESC
LIMIT 50
`, [
monitorID,
]);
list = R.convertToBeans("heartbeat", list);
heartbeatList[monitorID] = list.reverse().map(row => row.toPublicJSON());
const type = 24;
uptimeList[`${monitorID}_${type}`] = await Monitor.calcUptime(type, monitorID);
}
response.json({
heartbeatList,
uptimeList
});
} catch (error) {
send403(response, error.message);
}
});
router.get("/api/badge/:id/status", cache("5 minutes"), async (request, response) => { router.get("/api/badge/:id/status", cache("5 minutes"), async (request, response) => {
allowAllOrigin(response); allowAllOrigin(response);
@ -141,7 +237,6 @@ router.get("/api/badge/:id/status", cache("5 minutes"), async (request, response
const heartbeat = await Monitor.getPreviousHeartbeat(requestedMonitorId); const heartbeat = await Monitor.getPreviousHeartbeat(requestedMonitorId);
const state = overrideValue !== undefined ? overrideValue : heartbeat.status === 1; const state = overrideValue !== undefined ? overrideValue : heartbeat.status === 1;
badgeValues.label = label ? label : "";
badgeValues.color = state ? upColor : downColor; badgeValues.color = state ? upColor : downColor;
badgeValues.message = label ?? state ? upLabel : downLabel; badgeValues.message = label ?? state ? upLabel : downLabel;
} }
@ -281,4 +376,16 @@ router.get("/api/badge/:id/ping/:duration?", cache("5 minutes"), async (request,
} }
}); });
/**
* Send a 403 response
* @param {Object} res Express response object
* @param {string} [msg=""] Message to send
*/
function send403(res, msg = "") {
res.status(403).json({
"status": "fail",
"msg": msg,
});
}
module.exports = router; module.exports = router;
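
A hypothetical client-side call against the routes above; the base URL, push token and monitor id are made up.

const axios = require("axios");

async function probe() {
    // Push-monitor heartbeat; the route above answers { ok: true } on success.
    await axios.get("https://uptime.example.com/api/push/abc123def456");

    // Status badge for monitor 1, built server-side with badge-maker.
    const badge = await axios.get("https://uptime.example.com/api/badge/1/status");
    console.log(badge.data);
}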

View File

@ -1,148 +0,0 @@
let express = require("express");
const apicache = require("../modules/apicache");
const { UptimeKumaServer } = require("../uptime-kuma-server");
const StatusPage = require("../model/status_page");
const { allowDevAllOrigin, send403 } = require("../util-server");
const { R } = require("redbean-node");
const Monitor = require("../model/monitor");
let router = express.Router();
let cache = apicache.middleware;
const server = UptimeKumaServer.getInstance();
router.get("/status/:slug", cache("5 minutes"), async (request, response) => {
let slug = request.params.slug;
await StatusPage.handleStatusPageResponse(response, server.indexHTML, slug);
});
router.get("/status", cache("5 minutes"), async (request, response) => {
let slug = "default";
await StatusPage.handleStatusPageResponse(response, server.indexHTML, slug);
});
router.get("/status-page", cache("5 minutes"), async (request, response) => {
let slug = "default";
await StatusPage.handleStatusPageResponse(response, server.indexHTML, slug);
});
// Status page config, incident, monitor list
router.get("/api/status-page/:slug", cache("5 minutes"), async (request, response) => {
allowDevAllOrigin(response);
let slug = request.params.slug;
try {
// Get Status Page
let statusPage = await R.findOne("status_page", " slug = ? ", [
slug
]);
if (!statusPage) {
return null;
}
let statusPageData = await StatusPage.getStatusPageData(statusPage);
if (!statusPageData) {
response.statusCode = 404;
response.json({
msg: "Not Found"
});
return;
}
// Response
response.json(statusPageData);
} catch (error) {
send403(response, error.message);
}
});
// Status Page Polling Data
// Can fetch only if published
router.get("/api/status-page/heartbeat/:slug", cache("1 minutes"), async (request, response) => {
allowDevAllOrigin(response);
try {
let heartbeatList = {};
let uptimeList = {};
let slug = request.params.slug;
let statusPageID = await StatusPage.slugToID(slug);
let monitorIDList = await R.getCol(`
SELECT monitor_group.monitor_id FROM monitor_group, \`group\`
WHERE monitor_group.group_id = \`group\`.id
AND public = 1
AND \`group\`.status_page_id = ?
`, [
statusPageID
]);
for (let monitorID of monitorIDList) {
let list = await R.getAll(`
SELECT * FROM heartbeat
WHERE monitor_id = ?
ORDER BY time DESC
LIMIT 50
`, [
monitorID,
]);
list = R.convertToBeans("heartbeat", list);
heartbeatList[monitorID] = list.reverse().map(row => row.toPublicJSON());
const type = 24;
uptimeList[`${monitorID}_${type}`] = await Monitor.calcUptime(type, monitorID);
}
response.json({
heartbeatList,
uptimeList
});
} catch (error) {
send403(response, error.message);
}
});
// Status page's manifest.json
router.get("/api/status-page/:slug/manifest.json", cache("1440 minutes"), async (request, response) => {
allowDevAllOrigin(response);
let slug = request.params.slug;
try {
// Get Status Page
let statusPage = await R.findOne("status_page", " slug = ? ", [
slug
]);
if (!statusPage) {
response.statusCode = 404;
response.json({
msg: "Not Found"
});
return;
}
// Response
response.json({
"name": statusPage.title,
"start_url": "/status/" + statusPage.slug,
"display": "standalone",
"icons": [
{
"src": statusPage.icon,
"sizes": "128x128",
"type": "image/png"
}
]
});
} catch (error) {
send403(response, error.message);
}
});
module.exports = router;
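A minimal sketch of consuming the heartbeat endpoint defined in this router from a Node 18+ script (global fetch); the base URL and slug are assumptions, while the response shape ({ heartbeatList, uptimeList }) follows the handler above:

// Hypothetical consumer of GET /api/status-page/heartbeat/:slug.
async function fetchStatusPageHeartbeats(baseUrl, slug) {
    const response = await fetch(`${baseUrl}/api/status-page/heartbeat/${slug}`);
    if (!response.ok) {
        throw new Error(`Request failed with HTTP ${response.status}`);
    }
    const { heartbeatList, uptimeList } = await response.json();
    for (const [monitorID, beats] of Object.entries(heartbeatList)) {
        const uptime24h = uptimeList[monitorID + "_24"]; // key format taken from the handler above
        console.log(`Monitor ${monitorID}: ${beats.length} heartbeats, 24h uptime ${uptime24h}`);
    }
}

// Example call (both arguments are assumptions):
// fetchStatusPageHeartbeats("http://localhost:3001", "default");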

View File

@ -5,12 +5,6 @@
*/ */
console.log("Welcome to Uptime Kuma"); console.log("Welcome to Uptime Kuma");
// As the log function need to use dayjs, it should be very top
const dayjs = require("dayjs");
dayjs.extend(require("dayjs/plugin/utc"));
dayjs.extend(require("./modules/dayjs/plugin/timezone"));
dayjs.extend(require("dayjs/plugin/customParseFormat"));
// Check Node.js Version // Check Node.js Version
const nodeVersion = parseInt(process.versions.node.split(".")[0]); const nodeVersion = parseInt(process.versions.node.split(".")[0]);
const requiredVersion = 14; const requiredVersion = 14;
@ -22,7 +16,7 @@ if (nodeVersion < requiredVersion) {
} }
const args = require("args-parser")(process.argv); const args = require("args-parser")(process.argv);
const { sleep, log, getRandomInt, genSecret, isDev } = require("../src/util"); const { sleep, log, getRandomInt, genSecret, debug, isDev } = require("../src/util");
const config = require("./config"); const config = require("./config");
log.info("server", "Welcome to Uptime Kuma"); log.info("server", "Welcome to Uptime Kuma");
@ -39,10 +33,8 @@ log.info("server", "Importing Node libraries");
const fs = require("fs"); const fs = require("fs");
log.info("server", "Importing 3rd-party libraries"); log.info("server", "Importing 3rd-party libraries");
log.debug("server", "Importing express"); log.debug("server", "Importing express");
const express = require("express"); const express = require("express");
const expressStaticGzip = require("express-static-gzip");
log.debug("server", "Importing redbean-node"); log.debug("server", "Importing redbean-node");
const { R } = require("redbean-node"); const { R } = require("redbean-node");
log.debug("server", "Importing jsonwebtoken"); log.debug("server", "Importing jsonwebtoken");
@ -68,7 +60,7 @@ log.info("server", "Importing this project modules");
log.debug("server", "Importing Monitor"); log.debug("server", "Importing Monitor");
const Monitor = require("./model/monitor"); const Monitor = require("./model/monitor");
log.debug("server", "Importing Settings"); log.debug("server", "Importing Settings");
const { getSettings, setSettings, setting, initJWTSecret, checkLogin, startUnitTest, FBSD, doubleCheckPassword, startE2eTests } = require("./util-server"); const { getSettings, setSettings, setting, initJWTSecret, checkLogin, startUnitTest, FBSD, doubleCheckPassword } = require("./util-server");
log.debug("server", "Importing Notification"); log.debug("server", "Importing Notification");
const { Notification } = require("./notification"); const { Notification } = require("./notification");
@ -119,25 +111,19 @@ const twoFAVerifyOptions = {
* @type {boolean} * @type {boolean}
*/ */
const testMode = !!args["test"] || false; const testMode = !!args["test"] || false;
const e2eTestMode = !!args["e2e"] || false;
if (config.demoMode) { if (config.demoMode) {
log.info("server", "==== Demo Mode ===="); log.info("server", "==== Demo Mode ====");
} }
// Must be after io instantiation // Must be after io instantiation
const { sendNotificationList, sendHeartbeatList, sendImportantHeartbeatList, sendInfo, sendProxyList, sendDockerHostList } = require("./client"); const { sendNotificationList, sendHeartbeatList, sendImportantHeartbeatList, sendInfo, sendProxyList } = require("./client");
const { statusPageSocketHandler } = require("./socket-handlers/status-page-socket-handler"); const { statusPageSocketHandler } = require("./socket-handlers/status-page-socket-handler");
const databaseSocketHandler = require("./socket-handlers/database-socket-handler"); const databaseSocketHandler = require("./socket-handlers/database-socket-handler");
const TwoFA = require("./2fa"); const TwoFA = require("./2fa");
const StatusPage = require("./model/status_page"); const StatusPage = require("./model/status_page");
const { cloudflaredSocketHandler, autoStart: cloudflaredAutoStart, stop: cloudflaredStop } = require("./socket-handlers/cloudflared-socket-handler"); const { cloudflaredSocketHandler, autoStart: cloudflaredAutoStart, stop: cloudflaredStop } = require("./socket-handlers/cloudflared-socket-handler");
const { proxySocketHandler } = require("./socket-handlers/proxy-socket-handler"); const { proxySocketHandler } = require("./socket-handlers/proxy-socket-handler");
const { dockerSocketHandler } = require("./socket-handlers/docker-socket-handler");
const { maintenanceSocketHandler } = require("./socket-handlers/maintenance-socket-handler");
const { generalSocketHandler } = require("./socket-handlers/general-socket-handler");
const { Settings } = require("./settings");
const { CacheableDnsHttpAgent } = require("./cacheable-dns-http-agent");
app.use(express.json()); app.use(express.json());
@ -162,12 +148,27 @@ let jwtSecret = null;
*/ */
let needSetup = false; let needSetup = false;
/**
* Cache Index HTML
* @type {string}
*/
let indexHTML = "";
try {
indexHTML = fs.readFileSync("./dist/index.html").toString();
} catch (e) {
// "dist/index.html" is not necessary for development
if (process.env.NODE_ENV !== "development") {
log.error("server", "Error: Cannot find 'dist/index.html', did you install correctly?");
process.exit(1);
}
}
(async () => { (async () => {
Database.init(args); Database.init(args);
await initDatabase(testMode); await initDatabase(testMode);
await server.initAfterDatabaseReady();
server.entryPage = await Settings.get("entryPage"); exports.entryPage = await setting("entryPage");
await StatusPage.loadDomainMappingList(); await StatusPage.loadDomainMappingList();
log.info("server", "Adding route"); log.info("server", "Adding route");
@ -178,26 +179,13 @@ let needSetup = false;
// Entry Page // Entry Page
app.get("/", async (request, response) => { app.get("/", async (request, response) => {
let hostname = request.hostname; debug(`Request Domain: ${request.hostname}`);
if (await setting("trustProxy")) {
const proxy = request.headers["x-forwarded-host"];
if (proxy) {
hostname = proxy;
}
}
log.debug("entry", `Request Domain: ${hostname}`);
const uptimeKumaEntryPage = server.entryPage;
if (hostname in StatusPage.domainMappingList) {
log.debug("entry", "This is a status page domain");
let slug = StatusPage.domainMappingList[hostname];
await StatusPage.handleStatusPageResponse(response, server.indexHTML, slug);
} else if (uptimeKumaEntryPage && uptimeKumaEntryPage.startsWith("statusPage-")) {
response.redirect("/status/" + uptimeKumaEntryPage.replace("statusPage-", ""));
if (request.hostname in StatusPage.domainMappingList) {
debug("This is a status page domain");
response.send(indexHTML);
} else if (exports.entryPage && exports.entryPage.startsWith("statusPage-")) {
response.redirect("/status/" + exports.entryPage.replace("statusPage-", ""));
} else { } else {
response.redirect("/dashboard"); response.redirect("/dashboard");
} }
@ -205,7 +193,6 @@ let needSetup = false;
if (isDev) { if (isDev) {
app.post("/test-webhook", async (request, response) => { app.post("/test-webhook", async (request, response) => {
log.debug("test", request.headers);
log.debug("test", request.body); log.debug("test", request.body);
response.send("OK"); response.send("OK");
}); });
@ -214,7 +201,7 @@ let needSetup = false;
// Robots.txt // Robots.txt
app.get("/robots.txt", async (_request, response) => { app.get("/robots.txt", async (_request, response) => {
let txt = "User-agent: *\nDisallow:"; let txt = "User-agent: *\nDisallow:";
if (!await setting("searchEngineIndex")) { if (! await setting("searchEngineIndex")) {
txt += " /"; txt += " /";
} }
response.setHeader("Content-Type", "text/plain"); response.setHeader("Content-Type", "text/plain");
@ -227,9 +214,7 @@ let needSetup = false;
// With Basic Auth using the first user's username/password // With Basic Auth using the first user's username/password
app.get("/metrics", basicAuth, prometheusAPIMetrics()); app.get("/metrics", basicAuth, prometheusAPIMetrics());
app.use("/", expressStaticGzip("dist", { app.use("/", express.static("dist"));
enableBrotli: true,
}));
// ./data/upload // ./data/upload
app.use("/upload", express.static(Database.uploadDir)); app.use("/upload", express.static(Database.uploadDir));
@ -242,16 +227,12 @@ let needSetup = false;
const apiRouter = require("./routers/api-router"); const apiRouter = require("./routers/api-router");
app.use(apiRouter); app.use(apiRouter);
// Status Page Router
const statusPageRouter = require("./routers/status-page-router");
app.use(statusPageRouter);
// Universal Route Handler, must be at the end of all express routes. // Universal Route Handler, must be at the end of all express routes.
app.get("*", async (_request, response) => { app.get("*", async (_request, response) => {
if (_request.originalUrl.startsWith("/upload/")) { if (_request.originalUrl.startsWith("/upload/")) {
response.status(404).send("File not found."); response.status(404).send("File not found.");
} else { } else {
response.send(server.indexHTML); response.send(indexHTML);
} }
}); });
@ -270,9 +251,7 @@ let needSetup = false;
// *************************** // ***************************
socket.on("loginByToken", async (token, callback) => { socket.on("loginByToken", async (token, callback) => {
const clientIP = await server.getClientIP(socket); log.info("auth", `Login by token. IP=${getClientIp(socket)}`);
log.info("auth", `Login by token. IP=${clientIP}`);
try { try {
let decoded = jwt.verify(token, jwtSecret); let decoded = jwt.verify(token, jwtSecret);
@ -288,14 +267,14 @@ let needSetup = false;
afterLogin(socket, user); afterLogin(socket, user);
log.debug("auth", "afterLogin ok"); log.debug("auth", "afterLogin ok");
log.info("auth", `Successfully logged in user ${decoded.username}. IP=${clientIP}`); log.info("auth", `Successfully logged in user ${decoded.username}. IP=${getClientIp(socket)}`);
callback({ callback({
ok: true, ok: true,
}); });
} else { } else {
log.info("auth", `Inactive or deleted user ${decoded.username}. IP=${clientIP}`); log.info("auth", `Inactive or deleted user ${decoded.username}. IP=${getClientIp(socket)}`);
callback({ callback({
ok: false, ok: false,
@ -304,7 +283,7 @@ let needSetup = false;
} }
} catch (error) { } catch (error) {
log.error("auth", `Invalid token. IP=${clientIP}`); log.error("auth", `Invalid token. IP=${getClientIp(socket)}`);
callback({ callback({
ok: false, ok: false,
@ -315,9 +294,7 @@ let needSetup = false;
}); });
socket.on("login", async (data, callback) => { socket.on("login", async (data, callback) => {
const clientIP = await server.getClientIP(socket); log.info("auth", `Login by username + password. IP=${getClientIp(socket)}`);
log.info("auth", `Login by username + password. IP=${clientIP}`);
// Checking // Checking
if (typeof callback !== "function") { if (typeof callback !== "function") {
@ -330,7 +307,7 @@ let needSetup = false;
// Login Rate Limit // Login Rate Limit
if (! await loginRateLimiter.pass(callback)) { if (! await loginRateLimiter.pass(callback)) {
log.info("auth", `Too many failed requests for user ${data.username}. IP=${clientIP}`); log.info("auth", `Too many failed requests for user ${data.username}. IP=${getClientIp(socket)}`);
return; return;
} }
@ -340,7 +317,7 @@ let needSetup = false;
if (user.twofa_status === 0) { if (user.twofa_status === 0) {
afterLogin(socket, user); afterLogin(socket, user);
log.info("auth", `Successfully logged in user ${data.username}. IP=${clientIP}`); log.info("auth", `Successfully logged in user ${data.username}. IP=${getClientIp(socket)}`);
callback({ callback({
ok: true, ok: true,
@ -352,7 +329,7 @@ let needSetup = false;
if (user.twofa_status === 1 && !data.token) { if (user.twofa_status === 1 && !data.token) {
log.info("auth", `2FA token required for user ${data.username}. IP=${clientIP}`); log.info("auth", `2FA token required for user ${data.username}. IP=${getClientIp(socket)}`);
callback({ callback({
tokenRequired: true, tokenRequired: true,
@ -370,7 +347,7 @@ let needSetup = false;
socket.userID, socket.userID,
]); ]);
log.info("auth", `Successfully logged in user ${data.username}. IP=${clientIP}`); log.info("auth", `Successfully logged in user ${data.username}. IP=${getClientIp(socket)}`);
callback({ callback({
ok: true, ok: true,
@ -380,7 +357,7 @@ let needSetup = false;
}); });
} else { } else {
log.warn("auth", `Invalid token provided for user ${data.username}. IP=${clientIP}`); log.warn("auth", `Invalid token provided for user ${data.username}. IP=${getClientIp(socket)}`);
callback({ callback({
ok: false, ok: false,
@ -390,7 +367,7 @@ let needSetup = false;
} }
} else { } else {
log.warn("auth", `Incorrect username or password for user ${data.username}. IP=${clientIP}`); log.warn("auth", `Incorrect username or password for user ${data.username}. IP=${getClientIp(socket)}`);
callback({ callback({
ok: false, ok: false,
@ -462,8 +439,6 @@ let needSetup = false;
}); });
socket.on("save2FA", async (currentPassword, callback) => { socket.on("save2FA", async (currentPassword, callback) => {
const clientIP = await server.getClientIP(socket);
try { try {
if (! await twoFaRateLimiter.pass(callback)) { if (! await twoFaRateLimiter.pass(callback)) {
return; return;
@ -476,7 +451,7 @@ let needSetup = false;
socket.userID, socket.userID,
]); ]);
log.info("auth", `Saved 2FA token. IP=${clientIP}`); log.info("auth", `Saved 2FA token. IP=${getClientIp(socket)}`);
callback({ callback({
ok: true, ok: true,
@ -484,7 +459,7 @@ let needSetup = false;
}); });
} catch (error) { } catch (error) {
log.error("auth", `Error changing 2FA token. IP=${clientIP}`); log.error("auth", `Error changing 2FA token. IP=${getClientIp(socket)}`);
callback({ callback({
ok: false, ok: false,
@ -494,8 +469,6 @@ let needSetup = false;
}); });
socket.on("disable2FA", async (currentPassword, callback) => { socket.on("disable2FA", async (currentPassword, callback) => {
const clientIP = await server.getClientIP(socket);
try { try {
if (! await twoFaRateLimiter.pass(callback)) { if (! await twoFaRateLimiter.pass(callback)) {
return; return;
@ -505,7 +478,7 @@ let needSetup = false;
await doubleCheckPassword(socket, currentPassword); await doubleCheckPassword(socket, currentPassword);
await TwoFA.disable2FA(socket.userID); await TwoFA.disable2FA(socket.userID);
log.info("auth", `Disabled 2FA token. IP=${clientIP}`); log.info("auth", `Disabled 2FA token. IP=${getClientIp(socket)}`);
callback({ callback({
ok: true, ok: true,
@ -513,7 +486,7 @@ let needSetup = false;
}); });
} catch (error) { } catch (error) {
log.error("auth", `Error disabling 2FA token. IP=${clientIP}`); log.error("auth", `Error disabling 2FA token. IP=${getClientIp(socket)}`);
callback({ callback({
ok: false, ok: false,
@ -634,9 +607,6 @@ let needSetup = false;
bean.import(monitor); bean.import(monitor);
bean.user_id = socket.userID; bean.user_id = socket.userID;
bean.validate();
await R.store(bean); await R.store(bean);
await updateMonitorNotification(bean.id, notificationIDList); await updateMonitorNotification(bean.id, notificationIDList);
@ -687,10 +657,9 @@ let needSetup = false;
bean.basic_auth_pass = monitor.basic_auth_pass; bean.basic_auth_pass = monitor.basic_auth_pass;
bean.interval = monitor.interval; bean.interval = monitor.interval;
bean.retryInterval = monitor.retryInterval; bean.retryInterval = monitor.retryInterval;
bean.resendInterval = monitor.resendInterval;
bean.hostname = monitor.hostname; bean.hostname = monitor.hostname;
bean.maxretries = monitor.maxretries; bean.maxretries = monitor.maxretries;
bean.port = parseInt(monitor.port); bean.port = monitor.port;
bean.keyword = monitor.keyword; bean.keyword = monitor.keyword;
bean.ignoreTls = monitor.ignoreTls; bean.ignoreTls = monitor.ignoreTls;
bean.expiryNotification = monitor.expiryNotification; bean.expiryNotification = monitor.expiryNotification;
@ -700,32 +669,11 @@ let needSetup = false;
bean.dns_resolve_type = monitor.dns_resolve_type; bean.dns_resolve_type = monitor.dns_resolve_type;
bean.dns_resolve_server = monitor.dns_resolve_server; bean.dns_resolve_server = monitor.dns_resolve_server;
bean.pushToken = monitor.pushToken; bean.pushToken = monitor.pushToken;
bean.docker_container = monitor.docker_container;
bean.docker_host = monitor.docker_host;
bean.proxyId = Number.isInteger(monitor.proxyId) ? monitor.proxyId : null; bean.proxyId = Number.isInteger(monitor.proxyId) ? monitor.proxyId : null;
bean.mqttUsername = monitor.mqttUsername; bean.mqttUsername = monitor.mqttUsername;
bean.mqttPassword = monitor.mqttPassword; bean.mqttPassword = monitor.mqttPassword;
bean.mqttTopic = monitor.mqttTopic; bean.mqttTopic = monitor.mqttTopic;
bean.mqttSuccessMessage = monitor.mqttSuccessMessage; bean.mqttSuccessMessage = monitor.mqttSuccessMessage;
bean.databaseConnectionString = monitor.databaseConnectionString;
bean.databaseQuery = monitor.databaseQuery;
bean.authMethod = monitor.authMethod;
bean.authWorkstation = monitor.authWorkstation;
bean.authDomain = monitor.authDomain;
bean.grpcUrl = monitor.grpcUrl;
bean.grpcProtobuf = monitor.grpcProtobuf;
bean.grpcServiceName = monitor.grpcServiceName;
bean.grpcMethod = monitor.grpcMethod;
bean.grpcBody = monitor.grpcBody;
bean.grpcMetadata = monitor.grpcMetadata;
bean.grpcEnableTls = monitor.grpcEnableTls;
bean.radiusUsername = monitor.radiusUsername;
bean.radiusPassword = monitor.radiusPassword;
bean.radiusCalledStationId = monitor.radiusCalledStationId;
bean.radiusCallingStationId = monitor.radiusCallingStationId;
bean.radiusSecret = monitor.radiusSecret;
bean.validate();
await R.store(bean); await R.store(bean);
@ -941,21 +889,13 @@ let needSetup = false;
try { try {
checkLogin(socket); checkLogin(socket);
let bean = await R.findOne("tag", " id = ? ", [ tag.id ]); let bean = await R.findOne("monitor", " id = ? ", [ tag.id ]);
if (bean == null) {
callback({
ok: false,
msg: "Tag not found",
});
return;
}
bean.name = tag.name; bean.name = tag.name;
bean.color = tag.color; bean.color = tag.color;
await R.store(bean); await R.store(bean);
callback({ callback({
ok: true, ok: true,
msg: "Saved",
tag: await bean.toJSON(), tag: await bean.toJSON(),
}); });
@ -1089,15 +1029,10 @@ let needSetup = false;
socket.on("getSettings", async (callback) => { socket.on("getSettings", async (callback) => {
try { try {
checkLogin(socket); checkLogin(socket);
const data = await getSettings("general");
if (!data.serverTimezone) {
data.serverTimezone = await server.getTimezone();
}
callback({ callback({
ok: true, ok: true,
data: data, data: await getSettings("general"),
}); });
} catch (e) { } catch (e) {
@ -1123,14 +1058,7 @@ let needSetup = false;
} }
await setSettings("general", data); await setSettings("general", data);
server.entryPage = data.entryPage; exports.entryPage = data.entryPage;
await CacheableDnsHttpAgent.update();
// Also need to apply timezone globally
if (data.serverTimezone) {
await server.setTimezone(data.serverTimezone);
}
callback({ callback({
ok: true, ok: true,
@ -1138,7 +1066,6 @@ let needSetup = false;
}); });
sendInfo(socket); sendInfo(socket);
server.sendMaintenanceList(socket);
} catch (e) { } catch (e) {
callback({ callback({
@ -1320,14 +1247,10 @@ let needSetup = false;
method: monitorListData[i].method || "GET", method: monitorListData[i].method || "GET",
body: monitorListData[i].body, body: monitorListData[i].body,
headers: monitorListData[i].headers, headers: monitorListData[i].headers,
authMethod: monitorListData[i].authMethod,
basic_auth_user: monitorListData[i].basic_auth_user, basic_auth_user: monitorListData[i].basic_auth_user,
basic_auth_pass: monitorListData[i].basic_auth_pass, basic_auth_pass: monitorListData[i].basic_auth_pass,
authWorkstation: monitorListData[i].authWorkstation,
authDomain: monitorListData[i].authDomain,
interval: monitorListData[i].interval, interval: monitorListData[i].interval,
retryInterval: retryInterval, retryInterval: retryInterval,
resendInterval: monitorListData[i].resendInterval || 0,
hostname: monitorListData[i].hostname, hostname: monitorListData[i].hostname,
maxretries: monitorListData[i].maxretries, maxretries: monitorListData[i].maxretries,
port: monitorListData[i].port, port: monitorListData[i].port,
@ -1496,9 +1419,6 @@ let needSetup = false;
cloudflaredSocketHandler(socket); cloudflaredSocketHandler(socket);
databaseSocketHandler(socket); databaseSocketHandler(socket);
proxySocketHandler(socket); proxySocketHandler(socket);
dockerSocketHandler(socket);
maintenanceSocketHandler(socket);
generalSocketHandler(socket, server);
log.debug("server", "added all socket handlers"); log.debug("server", "added all socket handlers");
@ -1536,10 +1456,6 @@ let needSetup = false;
if (testMode) { if (testMode) {
startUnitTest(); startUnitTest();
} }
if (e2eTestMode) {
startE2eTests();
}
}); });
initBackgroundJobs(args); initBackgroundJobs(args);
@ -1601,10 +1517,8 @@ async function afterLogin(socket, user) {
socket.join(user.id); socket.join(user.id);
let monitorList = await server.sendMonitorList(socket); let monitorList = await server.sendMonitorList(socket);
server.sendMaintenanceList(socket);
sendNotificationList(socket); sendNotificationList(socket);
sendProxyList(socket); sendProxyList(socket);
sendDockerHostList(socket);
await sleep(500); await sleep(500);
@ -1621,13 +1535,6 @@ async function afterLogin(socket, user) {
for (let monitorID in monitorList) { for (let monitorID in monitorList) {
await Monitor.sendStats(io, monitorID, user.id); await Monitor.sendStats(io, monitorID, user.id);
} }
// Set server timezone from client browser if not set
// It should be run once only
if (! await Settings.get("initServerTimezone")) {
log.debug("server", "emit initServerTimezone");
socket.emit("initServerTimezone");
}
} }
/** /**
@ -1754,8 +1661,6 @@ async function shutdownFunction(signal) {
log.info("server", "Shutdown requested"); log.info("server", "Shutdown requested");
log.info("server", "Called signal: " + signal); log.info("server", "Called signal: " + signal);
await server.stop();
log.info("server", "Stopping all monitors"); log.info("server", "Stopping all monitors");
for (let id in server.monitorList) { for (let id in server.monitorList) {
let monitor = server.monitorList[id]; let monitor = server.monitorList[id];
@ -1766,7 +1671,10 @@ async function shutdownFunction(signal) {
stopBackgroundJobs(); stopBackgroundJobs();
await cloudflaredStop(); await cloudflaredStop();
Settings.stopCacheCleaner(); }
function getClientIp(socket) {
return socket.client.conn.remoteAddress.replace(/^.*:/, "");
} }
/** Final function called before application exits */ /** Final function called before application exits */
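A minimal client-side sketch of the loginByToken flow handled in this file; the server URL and the environment variable holding the token are assumptions:

// Hypothetical socket.io client re-authenticating with a JWT obtained from an earlier "login" callback.
const { io } = require("socket.io-client");

const socket = io("http://localhost:3001"); // assumed server address
const token = process.env.UPTIME_KUMA_TOKEN; // assumed storage location for the token
socket.emit("loginByToken", token, (res) => {
    console.log(res.ok ? "Re-authenticated" : `Login failed: ${res.msg}`);
    socket.disconnect();
});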

View File

@ -1,172 +0,0 @@
const { R } = require("redbean-node");
const { log } = require("../src/util");
class Settings {
/**
* Example:
* {
* key1: {
* value: "value2",
* timestamp: 12345678
* },
* key2: {
* value: 2,
* timestamp: 12345678
* },
* }
* @type {{}}
*/
static cacheList = {
};
static cacheCleaner = null;
/**
* Retrieve value of setting based on key
* @param {string} key Key of setting to retrieve
* @returns {Promise<any>} Value
*/
static async get(key) {
// Start cache clear if not started yet
if (!Settings.cacheCleaner) {
Settings.cacheCleaner = setInterval(() => {
log.debug("settings", "Cache Cleaner is just started.");
for (key in Settings.cacheList) {
if (Date.now() - Settings.cacheList[key].timestamp > 60 * 1000) {
log.debug("settings", "Cache Cleaner deleted: " + key);
delete Settings.cacheList[key];
}
}
}, 60 * 1000);
}
// Query from cache
if (key in Settings.cacheList) {
const v = Settings.cacheList[key].value;
log.debug("settings", `Get Setting (cache): ${key}: ${v}`);
return v;
}
let value = await R.getCell("SELECT `value` FROM setting WHERE `key` = ? ", [
key,
]);
try {
const v = JSON.parse(value);
log.debug("settings", `Get Setting: ${key}: ${v}`);
Settings.cacheList[key] = {
value: v,
timestamp: Date.now()
};
return v;
} catch (e) {
return value;
}
}
/**
* Sets the specified setting to specified value
* @param {string} key Key of setting to set
* @param {any} value Value to set to
* @param {?string} type Type of setting
* @returns {Promise<void>}
*/
static async set(key, value, type = null) {
let bean = await R.findOne("setting", " `key` = ? ", [
key,
]);
if (!bean) {
bean = R.dispense("setting");
bean.key = key;
}
bean.type = type;
bean.value = JSON.stringify(value);
await R.store(bean);
Settings.deleteCache([ key ]);
}
/**
* Get settings based on type
* @param {string} type The type of setting
* @returns {Promise<Bean>}
*/
static async getSettings(type) {
let list = await R.getAll("SELECT `key`, `value` FROM setting WHERE `type` = ? ", [
type,
]);
let result = {};
for (let row of list) {
try {
result[row.key] = JSON.parse(row.value);
} catch (e) {
result[row.key] = row.value;
}
}
return result;
}
/**
* Set settings based on type
* @param {string} type Type of settings to set
* @param {Object} data Values of settings
* @returns {Promise<void>}
*/
static async setSettings(type, data) {
let keyList = Object.keys(data);
let promiseList = [];
for (let key of keyList) {
let bean = await R.findOne("setting", " `key` = ? ", [
key
]);
if (bean == null) {
bean = R.dispense("setting");
bean.type = type;
bean.key = key;
}
if (bean.type === type) {
bean.value = JSON.stringify(data[key]);
promiseList.push(R.store(bean));
}
}
await Promise.all(promiseList);
Settings.deleteCache(keyList);
}
/**
* Remove the given keys from the settings cache
* @param {string[]} keyList
*/
static deleteCache(keyList) {
for (let key of keyList) {
delete Settings.cacheList[key];
}
}
static stopCacheCleaner() {
if (Settings.cacheCleaner) {
clearInterval(Settings.cacheCleaner);
Settings.cacheCleaner = null;
}
}
}
module.exports = {
Settings,
};
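A minimal usage sketch of the Settings class above; the require path, key, value, and type are assumptions:

// Hypothetical usage: write a setting, then read it twice (the second read is served from the 60-second cache).
const { Settings } = require("./settings");

async function settingsDemo() {
    await Settings.set("exampleKey", { enabled: true }, "general"); // illustrative key/value/type
    const first = await Settings.get("exampleKey");  // queries the database, then caches the value
    const second = await Settings.get("exampleKey"); // returned from Settings.cacheList
    console.log(first, second);
    Settings.stopCacheCleaner(); // stop the cleanup interval so a short-lived script can exit
}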

View File

@ -1,7 +1,6 @@
const { checkLogin, setSetting, setting, doubleCheckPassword } = require("../util-server"); const { checkLogin, setSetting, setting, doubleCheckPassword } = require("../util-server");
const { CloudflaredTunnel } = require("node-cloudflared-tunnel"); const { CloudflaredTunnel } = require("node-cloudflared-tunnel");
const { UptimeKumaServer } = require("../uptime-kuma-server"); const { UptimeKumaServer } = require("../uptime-kuma-server");
const { log } = require("../../src/util");
const io = UptimeKumaServer.getInstance().io; const io = UptimeKumaServer.getInstance().io;
const prefix = "cloudflared_"; const prefix = "cloudflared_";
@ -64,10 +63,7 @@ module.exports.cloudflaredSocketHandler = (socket) => {
socket.on(prefix + "stop", async (currentPassword, callback) => { socket.on(prefix + "stop", async (currentPassword, callback) => {
try { try {
checkLogin(socket); checkLogin(socket);
const disabledAuth = await setting("disableAuth"); await doubleCheckPassword(socket, currentPassword);
if (!disabledAuth) {
await doubleCheckPassword(socket, currentPassword);
}
cloudflared.stop(); cloudflared.stop();
} catch (error) { } catch (error) {
callback({ callback({
@ -108,7 +104,7 @@ module.exports.autoStart = async (token) => {
/** Stop cloudflared */ /** Stop cloudflared */
module.exports.stop = async () => { module.exports.stop = async () => {
log.info("cloudflared", "Stop cloudflared"); console.log("Stop cloudflared");
if (cloudflared) { if (cloudflared) {
cloudflared.stop(); cloudflared.stop();
} }

View File

@ -1,79 +0,0 @@
const { sendDockerHostList } = require("../client");
const { checkLogin } = require("../util-server");
const { DockerHost } = require("../docker");
const { log } = require("../../src/util");
/**
* Handlers for docker hosts
* @param {Socket} socket Socket.io instance
*/
module.exports.dockerSocketHandler = (socket) => {
socket.on("addDockerHost", async (dockerHost, dockerHostID, callback) => {
try {
checkLogin(socket);
let dockerHostBean = await DockerHost.save(dockerHost, dockerHostID, socket.userID);
await sendDockerHostList(socket);
callback({
ok: true,
msg: "Saved",
id: dockerHostBean.id,
});
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("deleteDockerHost", async (dockerHostID, callback) => {
try {
checkLogin(socket);
await DockerHost.delete(dockerHostID, socket.userID);
await sendDockerHostList(socket);
callback({
ok: true,
msg: "Deleted",
});
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("testDockerHost", async (dockerHost, callback) => {
try {
checkLogin(socket);
let amount = await DockerHost.testDockerHost(dockerHost);
let msg;
if (amount >= 1) {
msg = "Connected Successfully. Amount of containers: " + amount;
} else {
msg = "Connected Successfully, but there are no containers?";
}
callback({
ok: true,
msg,
});
} catch (e) {
log.error("docker", e);
callback({
ok: false,
msg: e.message,
});
}
});
};
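A minimal client-side sketch of calling the testDockerHost handler above over socket.io; the connection URL and the fields on the docker host object are assumptions, and the handler requires a logged-in socket (see checkLogin above):

// Hypothetical socket.io client exercising the "testDockerHost" event registered above.
const { io } = require("socket.io-client");

const socket = io("http://localhost:3001"); // assumed server address
socket.emit("testDockerHost", {
    dockerDaemon: "/var/run/docker.sock", // assumed field names for the docker host payload
    dockerType: "socket",
}, (res) => {
    console.log(res.ok ? res.msg : `Test failed: ${res.msg}`);
    socket.disconnect();
});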

View File

@ -1,20 +0,0 @@
const { log } = require("../../src/util");
const { Settings } = require("../settings");
const { sendInfo } = require("../client");
const { checkLogin } = require("../util-server");
module.exports.generalSocketHandler = (socket, server) => {
socket.on("initServerTimezone", async (timezone) => {
try {
checkLogin(socket);
log.debug("generalSocketHandler", "Timezone: " + timezone);
await Settings.set("initServerTimezone", true);
await server.setTimezone(timezone);
await sendInfo(socket);
} catch (e) {
log.warn("initServerTimezone", e.message);
}
});
};
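A minimal browser-side sketch of answering the initServerTimezone request handled above; detecting the timezone with Intl is an assumption about the client, and socket stands for the client's existing socket.io connection:

// Hypothetical client counterpart: when the server asks for a timezone, reply with the detected one.
socket.on("initServerTimezone", () => {
    const timezone = Intl.DateTimeFormat().resolvedOptions().timeZone; // e.g. "Europe/Berlin"
    socket.emit("initServerTimezone", timezone);
});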

View File

@ -1,317 +0,0 @@
const { checkLogin } = require("../util-server");
const { log } = require("../../src/util");
const { R } = require("redbean-node");
const apicache = require("../modules/apicache");
const { UptimeKumaServer } = require("../uptime-kuma-server");
const Maintenance = require("../model/maintenance");
const server = UptimeKumaServer.getInstance();
const MaintenanceTimeslot = require("../model/maintenance_timeslot");
/**
* Handlers for Maintenance
* @param {Socket} socket Socket.io instance
*/
module.exports.maintenanceSocketHandler = (socket) => {
// Add a new maintenance
socket.on("addMaintenance", async (maintenance, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", maintenance);
let bean = Maintenance.jsonToBean(R.dispense("maintenance"), maintenance);
bean.user_id = socket.userID;
let maintenanceID = await R.store(bean);
await MaintenanceTimeslot.generateTimeslot(bean);
await server.sendMaintenanceList(socket);
callback({
ok: true,
msg: "Added Successfully.",
maintenanceID,
});
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
// Edit a maintenance
socket.on("editMaintenance", async (maintenance, callback) => {
try {
checkLogin(socket);
let bean = await R.findOne("maintenance", " id = ? ", [ maintenance.id ]);
if (bean.user_id !== socket.userID) {
throw new Error("Permission denied.");
}
Maintenance.jsonToBean(bean, maintenance);
await R.store(bean);
await MaintenanceTimeslot.generateTimeslot(bean, null, true);
await server.sendMaintenanceList(socket);
callback({
ok: true,
msg: "Saved.",
maintenanceID: bean.id,
});
} catch (e) {
console.error(e);
callback({
ok: false,
msg: e.message,
});
}
});
// Add a new monitor_maintenance
socket.on("addMonitorMaintenance", async (maintenanceID, monitors, callback) => {
try {
checkLogin(socket);
await R.exec("DELETE FROM monitor_maintenance WHERE maintenance_id = ?", [
maintenanceID
]);
for await (const monitor of monitors) {
let bean = R.dispense("monitor_maintenance");
bean.import({
monitor_id: monitor.id,
maintenance_id: maintenanceID
});
await R.store(bean);
}
apicache.clear();
callback({
ok: true,
msg: "Added Successfully.",
});
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
// Add a new maintenance_status_page
socket.on("addMaintenanceStatusPage", async (maintenanceID, statusPages, callback) => {
try {
checkLogin(socket);
await R.exec("DELETE FROM maintenance_status_page WHERE maintenance_id = ?", [
maintenanceID
]);
for await (const statusPage of statusPages) {
let bean = R.dispense("maintenance_status_page");
bean.import({
status_page_id: statusPage.id,
maintenance_id: maintenanceID
});
await R.store(bean);
}
apicache.clear();
callback({
ok: true,
msg: "Added Successfully.",
});
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("getMaintenance", async (maintenanceID, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", `Get Maintenance: ${maintenanceID} User ID: ${socket.userID}`);
let bean = await R.findOne("maintenance", " id = ? AND user_id = ? ", [
maintenanceID,
socket.userID,
]);
callback({
ok: true,
maintenance: await bean.toJSON(),
});
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("getMaintenanceList", async (callback) => {
try {
checkLogin(socket);
await server.sendMaintenanceList(socket);
callback({
ok: true,
});
} catch (e) {
console.error(e);
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("getMonitorMaintenance", async (maintenanceID, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", `Get Monitors for Maintenance: ${maintenanceID} User ID: ${socket.userID}`);
let monitors = await R.getAll("SELECT monitor.id, monitor.name FROM monitor_maintenance mm JOIN monitor ON mm.monitor_id = monitor.id WHERE mm.maintenance_id = ? ", [
maintenanceID,
]);
callback({
ok: true,
monitors,
});
} catch (e) {
console.error(e);
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("getMaintenanceStatusPage", async (maintenanceID, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", `Get Status Pages for Maintenance: ${maintenanceID} User ID: ${socket.userID}`);
let statusPages = await R.getAll("SELECT status_page.id, status_page.title FROM maintenance_status_page msp JOIN status_page ON msp.status_page_id = status_page.id WHERE msp.maintenance_id = ? ", [
maintenanceID,
]);
callback({
ok: true,
statusPages,
});
} catch (e) {
console.error(e);
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("deleteMaintenance", async (maintenanceID, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", `Delete Maintenance: ${maintenanceID} User ID: ${socket.userID}`);
if (maintenanceID in server.maintenanceList) {
delete server.maintenanceList[maintenanceID];
}
await R.exec("DELETE FROM maintenance WHERE id = ? AND user_id = ? ", [
maintenanceID,
socket.userID,
]);
apicache.clear();
callback({
ok: true,
msg: "Deleted Successfully.",
});
await server.sendMaintenanceList(socket);
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("pauseMaintenance", async (maintenanceID, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", `Pause Maintenance: ${maintenanceID} User ID: ${socket.userID}`);
await R.exec("UPDATE maintenance SET active = 0 WHERE id = ? ", [
maintenanceID,
]);
apicache.clear();
callback({
ok: true,
msg: "Paused Successfully.",
});
await server.sendMaintenanceList(socket);
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
socket.on("resumeMaintenance", async (maintenanceID, callback) => {
try {
checkLogin(socket);
log.debug("maintenance", `Resume Maintenance: ${maintenanceID} User ID: ${socket.userID}`);
await R.exec("UPDATE maintenance SET active = 1 WHERE id = ? ", [
maintenanceID,
]);
apicache.clear();
callback({
ok: true,
msg: "Resume Successfully",
});
await server.sendMaintenanceList(socket);
} catch (e) {
callback({
ok: false,
msg: e.message,
});
}
});
};
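A minimal client-side sketch of creating a maintenance entry through the addMaintenance handler above; the fields on the maintenance object are assumptions (the accepted schema lives in Maintenance.jsonToBean), and the socket must already be authenticated:

// Hypothetical socket.io client call against the "addMaintenance" event registered above.
socket.emit("addMaintenance", {
    title: "Database upgrade",       // illustrative fields only
    description: "Planned downtime",
    active: true,
}, (res) => {
    if (res.ok) {
        console.log(`Created maintenance ${res.maintenanceID}`);
    } else {
        console.error(res.msg);
    }
});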

Some files were not shown because too many files have changed in this diff.