
Download

curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.2.4/docker-compose.yaml'
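The compose file fetched here defines the CeleryExecutor stack that appears in the startup log below (postgres, redis, webserver, scheduler, worker, triggerer, flower, plus a one-off airflow-init job). As a quick sanity check, assuming docker-compose v1 is installed as in the session below, the declared services can be listed before starting anything:

# List the service names declared in the downloaded compose file
docker-compose -f docker-compose.yaml config --services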

 

Environment

mkdir -p ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)" > .env
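The .env file only needs the AIRFLOW_UID entry; it makes the files written into the mounted ./dags, ./logs and ./plugins directories owned by the host user (in the root session of the log below this resolves to AIRFLOW_UID=0). A minimal check, plus the optional database-initialization step the official quick start runs before bringing up the whole stack (not strictly needed here, since the log below shows airflow-init running as part of the full docker-compose up):

# Confirm the env file was written, e.g. AIRFLOW_UID=0 on a root shell
cat .env

# Optional: run only the one-off init service first; it migrates the metadata DB
# and creates the default airflow/airflow admin account
docker-compose up airflow-init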

 

Start

docker-compose -f docker-compose.yaml up
[root@VM-0-13-centos airflow]# docker-compose -f docker-compose.yaml up
Creating network "airflow_default" with the default driver
Creating volume "airflow_postgres-db-volume" with default driver
Pulling postgres (postgres:13)...
13: Pulling from library/postgres
a603fa5e3b41: Pull complete
02d7a77348fd: Pull complete
Digest: sha256:3c6a77caf1ef2ae91ef1a2cdc2ae219e65e9ea274fbfa0d44af3ec0fccef0d8d
Status: Downloaded newer image for postgres:13
Pulling redis (redis:latest)...
latest: Pulling from library/redis
a603fa5e3b41: Already exists
828da1afb5be: Pull complete
Digest: sha256:1e3207c292225b6dd21cb74d59255748a50e8f739dd983040df38fa913927cf1
Status: Downloaded newer image for redis:latest
Pulling airflow-init (apache/airflow:2.2.4)...
2.2.4: Pulling from apache/airflow
6552179c3509: Pull complete
c4887dad22fd: Pull complete
Digest: sha256:72a2cdcdabbc622c30940f1a9f262d047fdbbe96d5d7a4e324b8c7ec5ef56171
Status: Downloaded newer image for apache/airflow:2.2.4
Creating airflow_redis_1    ... done
Creating airflow_postgres_1 ... done
Creating airflow_airflow-init_1 ... done
Creating airflow_airflow-worker_1    ... done
Creating airflow_flower_1            ... done
Creating airflow_airflow-scheduler_1 ... done
Creating airflow_airflow-triggerer_1 ... done
Creating airflow_airflow-webserver_1 ... done
Attaching to airflow_redis_1, airflow_postgres_1, airflow_airflow-init_1, airflow_airflow-worker_1, airflow_flower_1, airflow_airflow-scheduler_1, airflow_airflow-triggerer_1, airflow_airflow-webserver_1
airflow-scheduler_1  | The container is run as root user. For security, consider using a regular user account.
airflow-init_1       | The container is run as root user. For security, consider using a regular user account.
airflow-init_1       | 
airflow-init_1       | [2022-11-22 14:50:04,935] {cli_action_loggers.py:105} WARNING - Failed to log action with (psycopg2.errors.UndefinedTable) relation "log" does not exist
airflow-init_1       | LINE 1: INSERT INTO log (dttm, dag_id, task_id, event, execution_dat...
airflow-init_1       |                     ^
airflow-init_1       | 
airflow-init_1       | [SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (%(dttm)s, %(dag_id)s, %(task_id)s, %(event)s, %(execution_date)s, %(owner)s, %(extra)s) RETURNING log.id]
airflow-init_1       | [parameters: {'dttm': datetime.datetime(2022, 11, 22, 14, 50, 4, 923049, tzinfo=Timezone('UTC')), 'dag_id': None, 'task_id': None, 'event': 'cli_upgradedb', 'execution_date': None, 'owner': 'root', 'extra': '{"host_name": "4cf9127445e0", "full_command": "[\'/home/airflow/.local/bin/airflow\', \'db\', \'upgrade\']"}'}]
airflow-init_1       | (Background on this error at: http://sqlalche.me/e/13/f405)
airflow-init_1       | DB: postgresql+psycopg2://airflow:***@postgres/airflow
airflow-init_1       | [2022-11-22 14:50:05,323] {db.py:919} INFO - Creating tables
airflow-init_1       | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
airflow-init_1       | INFO  [alembic.runtime.migration] Will assume transactional DDL.
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade  -> e3a246e0dc1, current schema
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 1507a7289a2f, create is_encrypted
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade 1507a7289a2f -> 13eb55f81627, maintain history for compatibility with earlier migrations
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade 13eb55f81627 -> 338e90f54d61, More logging into task_instance
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade 849da589634d -> 2c6edca13270, Resource based permissions.
airflow-init_1       | [2022-11-22 14:50:11,631] {manager.py:245} INFO - Inserted Role: Admin
airflow-init_1       | [2022-11-22 14:50:11,635] {manager.py:245} INFO - Inserted Role: Public
airflow-init_1       | [2022-11-22 14:50:11,637] {manager.py:779} WARNING - No user yet created, use flask fab command to do it.
airflow-init_1       | [2022-11-22 14:50:12,893] {manager.py:496} INFO - Created Permission View: can delete on Connections
airflow-init_1       | [2022-11-22 14:50:12,902] {manager.py:496} INFO - Created Permission View: can read on Connections
airflow-init_1       | [2022-11-22 14:50:13,694] {manager.py:245} INFO - Inserted Role: Viewer
airflow-init_1       | [2022-11-22 14:50:13,702] {manager.py:558} INFO - Added Permission can read on Audit Logs to role Viewer
airflow-init_1       | [2022-11-22 14:50:22,884] {manager.py:496} INFO - Created Permission View: can create on XComs
airflow-init_1       | [2022-11-22 14:50:22,889] {manager.py:558} INFO - Added Permission can create on XComs to role Admin
airflow-init_1       | [2022-11-22 14:50:24,588] {manager.py:214} INFO - Added user airflow
airflow-init_1       | User "airflow" created with role "Admin"
airflow-init_1       | 2.2.4
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade be2bfac3da23 -> c381b21cb7e4, add session table to db
airflow-init_1       | INFO  [alembic.runtime.migration] Running upgrade c381b21cb7e4 -> 587bdf053233, adding index for dag_id in job
airflow_airflow-init_1 exited with code 0
airflow-triggerer_1  | The container is run as root user. For security, consider using a regular user account.
flower_1             | The container is run as root user. For security, consider using a regular user account.
airflow-webserver_1  | The container is run as root user. For security, consider using a regular user account.
postgres_1           | The files belonging to this database system will be owned by user "postgres".
postgres_1           | This user must also own the server process.
postgres_1           | 
postgres_1           | The database cluster will be initialized with locale "en_US.utf8".
postgres_1           | The default database encoding has accordingly been set to "UTF8".
postgres_1           | The default text search configuration will be set to "english".
postgres_1           | 
postgres_1           | Data page checksums are disabled.
postgres_1           | 
postgres_1           | fixing permissions on existing directory /var/lib/postgresql/data ... ok
postgres_1           | creating subdirectories ... ok
postgres_1           | selecting dynamic shared memory implementation ... posix
postgres_1           | selecting default max_connections ... 100
postgres_1           | selecting default shared_buffers ... 128MB
postgres_1           | selecting default time zone ... Etc/UTC
postgres_1           | creating configuration files ... ok
postgres_1           | running bootstrap script ... ok
postgres_1           | performing post-bootstrap initialization ... ok
postgres_1           | syncing data to disk ... initdb: warning: enabling "trust" authentication for local connections
postgres_1           | You can change this by editing pg_hba.conf or using the option -A, or
postgres_1           | --auth-local and --auth-host, the next time you run initdb.
postgres_1           | ok
postgres_1           | 
postgres_1           | 
postgres_1           | Success. You can now start the database server using:
postgres_1           | 
postgres_1           |     pg_ctl -D /var/lib/postgresql/data -l logfile start
postgres_1           | 
postgres_1           | waiting for server to start....2022-11-22 14:49:50.399 UTC [49] LOG:  starting PostgreSQL 13.9 (Debian 13.9-1.pgdg110+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 10.2.1-6) 10.2.1 20210110, 64-bit
postgres_1           | 2022-11-22 14:49:50.403 UTC [49] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1           | 2022-11-22 14:49:50.424 UTC [50] LOG:  database system was shut down at 2022-11-22 14:49:50 UTC
postgres_1           | 2022-11-22 14:49:50.431 UTC [49] LOG:  database system is ready to accept connections
postgres_1           |  done
postgres_1           | server started
postgres_1           | CREATE DATABASE
postgres_1           | 
postgres_1           | 
postgres_1           | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
postgres_1           | 
postgres_1           | 2022-11-22 14:49:50.694 UTC [49] LOG:  received fast shutdown request
postgres_1           | waiting for server to shut down....2022-11-22 14:49:50.698 UTC [49] LOG:  aborting any active transactions
postgres_1           | 2022-11-22 14:49:50.700 UTC [49] LOG:  background worker "logical replication launcher" (PID 56) exited with exit code 1
postgres_1           | 2022-11-22 14:49:50.700 UTC [51] LOG:  shutting down
postgres_1           | 2022-11-22 14:49:50.723 UTC [49] LOG:  database system is shut down
postgres_1           |  done
postgres_1           | server stopped
postgres_1           | 
postgres_1           | PostgreSQL init process complete; ready for start up.
postgres_1           | 
postgres_1           | 2022-11-22 14:49:50.824 UTC [1] LOG:  starting PostgreSQL 13.9 (Debian 13.9-1.pgdg110+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 10.2.1-6) 10.2.1 20210110, 64-bit
postgres_1           | 2022-11-22 14:49:50.824 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
postgres_1           | 2022-11-22 14:49:50.824 UTC [1] LOG:  listening on IPv6 address "::", port 5432
postgres_1           | 2022-11-22 14:49:50.833 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1           | 2022-11-22 14:49:50.844 UTC [63] LOG:  database system was shut down at 2022-11-22 14:49:50 UTC
postgres_1           | 2022-11-22 14:49:50.852 UTC [1] LOG:  database system is ready to accept connections
postgres_1           | 2022-11-22 14:50:02.663 UTC [84] ERROR:  relation "log" does not exist at character 13
postgres_1           | 2022-11-22 14:50:02.663 UTC [84] STATEMENT:  INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES ('2022-11-22T14:50:02.646498+00:00'::timestamptz, NULL, NULL, 'cli_check', NULL, 'root', '{"host_name": "4cf9127445e0", "full_command": "[''/home/airflow/.local/bin/airflow'', ''db'', ''check'']"}') RETURNING log.id
postgres_1           | 2022-11-22 14:50:04.935 UTC [92] ERROR:  relation "log" does not exist at character 13
postgres_1           | 2022-11-22 14:50:04.935 UTC [92] STATEMENT:  INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES ('2022-11-22T14:50:04.923049+00:00'::timestamptz, NULL, NULL, 'cli_upgradedb', NULL, 'root', '{"host_name": "4cf9127445e0", "full_command": "[''/home/airflow/.local/bin/airflow'', ''db'', ''upgrade'']"}') RETURNING log.id
postgres_1           | 2022-11-22 14:50:05.315 UTC [92] ERROR:  relation "connection" does not exist at character 55
postgres_1           | 2022-11-22 14:50:05.315 UTC [92] STATEMENT:  SELECT connection.conn_id AS connection_conn_id 
postgres_1           | 	FROM connection GROUP BY connection.conn_id 
postgres_1           | 	HAVING count(*) > 1
postgres_1           | 2022-11-22 14:50:05.317 UTC [92] ERROR:  relation "connection" does not exist at character 55
postgres_1           | 2022-11-22 14:50:05.317 UTC [92] STATEMENT:  SELECT connection.conn_id AS connection_conn_id 
postgres_1           | 	FROM connection 
postgres_1           | 	WHERE connection.conn_type IS NULL
postgres_1           | 2022-11-22 14:50:17.811 UTC [109] WARNING:  you don't own a lock of type ExclusiveLock
redis_1              | 1:C 22 Nov 2022 14:49:49.314 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis_1              | 1:C 22 Nov 2022 14:49:49.314 # Redis version=7.0.5, bits=64, commit=00000000, modified=0, pid=1, just started
redis_1              | 1:C 22 Nov 2022 14:49:49.314 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1              | 1:M 22 Nov 2022 14:49:49.315 * monotonic clock: POSIX clock_gettime
redis_1              | 1:M 22 Nov 2022 14:49:49.316 * Running mode=standalone, port=6379.
redis_1              | 1:M 22 Nov 2022 14:49:49.316 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
redis_1              | 1:M 22 Nov 2022 14:49:49.316 # Server initialized
redis_1              | 1:M 22 Nov 2022 14:49:49.316 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1              | 1:M 22 Nov 2022 14:49:49.316 * Ready to accept connections
airflow-worker_1     | 
flower_1             | 
airflow-scheduler_1  | 
airflow-webserver_1  | 
airflow-triggerer_1  | 
airflow-triggerer_1  |   ____________       _____________
airflow-triggerer_1  |  ____    |__( )_________  __/__  /________      __
airflow-triggerer_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
airflow-triggerer_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
airflow-triggerer_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
airflow-triggerer_1  | [2022-11-22 14:50:42,995] {triggerer_job.py:101} INFO - Starting the triggerer
airflow-worker_1     | BACKEND=redis
airflow-worker_1     | DB_HOST=redis
airflow-worker_1     | DB_PORT=6379
airflow-worker_1     | 
airflow-scheduler_1  | BACKEND=redis
airflow-scheduler_1  | DB_HOST=redis
airflow-scheduler_1  | DB_PORT=6379
airflow-scheduler_1  | 
flower_1             | BACKEND=redis
flower_1             | DB_HOST=redis
flower_1             | DB_PORT=6379
flower_1             | 
airflow-scheduler_1  |   ____________       _____________
airflow-scheduler_1  |  ____    |__( )_________  __/__  /________      __
airflow-scheduler_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
airflow-scheduler_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
airflow-scheduler_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
airflow-scheduler_1  | [2022-11-22 14:50:47,025] {scheduler_job.py:619} INFO - Starting the scheduler
airflow-scheduler_1  | [2022-11-22 14:50:47,025] {scheduler_job.py:624} INFO - Processing each file at most -1 times
airflow-scheduler_1  | [2022-11-22 14:50:47,524] {manager.py:163} INFO - Launched DagFileProcessorManager with pid: 41
airflow-scheduler_1  | [2022-11-22 14:50:47,526] {scheduler_job.py:1137} INFO - Resetting orphaned tasks for active dag runs
airflow-scheduler_1  | [2022-11-22 14:50:47,543] {settings.py:55} INFO - Configured default timezone Timezone('UTC')
airflow-worker_1     | [2022-11-22 14:50:47 +0000] [39] [INFO] Starting gunicorn 20.1.0
airflow-worker_1     | [2022-11-22 14:50:47 +0000] [39] [INFO] Listening at: http://0.0.0.0:8793 (39)
airflow-worker_1     | [2022-11-22 14:50:47 +0000] [39] [INFO] Using worker: sync
airflow-worker_1     | [2022-11-22 14:50:47 +0000] [40] [INFO] Booting worker with pid: 40
airflow-worker_1     | [2022-11-22 14:50:47 +0000] [41] [INFO] Booting worker with pid: 41
airflow-worker_1     | /home/airflow/.local/lib/python3.7/site-packages/celery/platforms.py:841 SecurityWarning: You're running the worker with superuser privileges: this is
airflow-worker_1     | absolutely not recommended!
airflow-worker_1     | 
airflow-worker_1     | Please specify a different user using the --uid option.
airflow-worker_1     | 
airflow-worker_1     | User information: uid=0 euid=0 gid=0 egid=0
airflow-worker_1     | 
airflow-worker_1     |  
airflow-worker_1     |  -------------- celery@d6a0f59d34b5 v5.2.3 (dawn-chorus)
airflow-worker_1     | --- ***** ----- 
airflow-worker_1     | -- ******* ---- Linux-4.14.105-cc1e1d6+-x86_64-with-debian-10.11 2022-11-22 14:50:50
airflow-worker_1     | - *** --- * --- 
airflow-worker_1     | - ** ---------- [config]
airflow-worker_1     | - ** ---------- .> app:         airflow.executors.celery_executor:0x7fdacfbabe90
airflow-worker_1     | - ** ---------- .> transport:   redis://redis:6379/0
airflow-worker_1     | - ** ---------- .> results:     postgresql://airflow:**@postgres/airflow
airflow-worker_1     | - *** --- * --- .> concurrency: 16 (prefork)
airflow-worker_1     | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
airflow-worker_1     | --- ***** ----- 
airflow-worker_1     |  -------------- [queues]
airflow-worker_1     |                 .> default          exchange=default(direct) key=default
airflow-worker_1     |                 
airflow-worker_1     | 
airflow-worker_1     | [tasks]
airflow-worker_1     |   . airflow.executors.celery_executor.execute_command
airflow-worker_1     | 
flower_1             | [2022-11-22 14:50:50,854] {command.py:154} INFO - Visit me at http://0.0.0.0:5555
flower_1             | [2022-11-22 14:50:50,935] {command.py:159} INFO - Broker: redis://redis:6379/0
flower_1             | [2022-11-22 14:50:50,945] {command.py:162} INFO - Registered tasks: 
flower_1             | ['airflow.executors.celery_executor.execute_command',
flower_1             |  'celery.accumulate',
flower_1             |  'celery.backend_cleanup',
flower_1             |  'celery.chain',
flower_1             |  'celery.chord',
flower_1             |  'celery.chord_unlock',
flower_1             |  'celery.chunks',
flower_1             |  'celery.group',
flower_1             |  'celery.map',
flower_1             |  'celery.starmap']
flower_1             | [2022-11-22 14:50:51,056] {mixins.py:225} INFO - Connected to redis://redis:6379/0
flower_1             | [2022-11-22 14:50:52,119] {inspector.py:42} WARNING - Inspect method active_queues failed
flower_1             | [2022-11-22 14:50:52,120] {inspector.py:42} WARNING - Inspect method registered failed
flower_1             | [2022-11-22 14:50:52,133] {inspector.py:42} WARNING - Inspect method scheduled failed
flower_1             | [2022-11-22 14:50:52,139] {inspector.py:42} WARNING - Inspect method stats failed
flower_1             | [2022-11-22 14:50:52,165] {inspector.py:42} WARNING - Inspect method active failed
flower_1             | [2022-11-22 14:50:52,166] {inspector.py:42} WARNING - Inspect method reserved failed
flower_1             | [2022-11-22 14:50:52,172] {inspector.py:42} WARNING - Inspect method revoked failed
flower_1             | [2022-11-22 14:50:52,173] {inspector.py:42} WARNING - Inspect method conf failed
airflow-webserver_1  |   ____________       _____________
airflow-webserver_1  |  ____    |__( )_________  __/__  /________      __
airflow-webserver_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
airflow-webserver_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
airflow-webserver_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
airflow-webserver_1  | [2022-11-22 14:50:53,199] {dagbag.py:500} INFO - Filling up the DagBag from /dev/null
airflow-worker_1     | [2022-11-22 14:50:53,299: INFO/MainProcess] Connected to redis://redis:6379/0
airflow-worker_1     | [2022-11-22 14:50:53,314: INFO/MainProcess] mingle: searching for neighbors
airflow-webserver_1  | [2022-11-22 14:50:53,790] {manager.py:512} WARNING - Refused to delete permission view, assoc with role exists DAG Runs.can_create Admin
airflow-worker_1     | [2022-11-22 14:50:54,334: INFO/MainProcess] mingle: all alone
airflow-worker_1     | [2022-11-22 14:50:54,355: INFO/MainProcess] celery@d6a0f59d34b5 ready.
airflow-worker_1     | [2022-11-22 14:50:55,960: INFO/MainProcess] Events of group {task} enabled by remote.
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [38] [INFO] Starting gunicorn 20.1.0
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [38] [INFO] Listening at: http://0.0.0.0:8080 (38)
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [38] [INFO] Using worker: sync
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [41] [INFO] Booting worker with pid: 41
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [42] [INFO] Booting worker with pid: 42
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [43] [INFO] Booting worker with pid: 43
airflow-webserver_1  | [2022-11-22 14:50:58 +0000] [44] [INFO] Booting worker with pid: 44
airflow-webserver_1  | [2022-11-22 14:51:07,268] {manager.py:512} WARNING - Refused to delete permission view, assoc with role exists DAG Runs.can_create Admin
airflow-webserver_1  | [2022-11-22 14:51:07,513] {manager.py:512} WARNING - Refused to delete permission view, assoc with role exists DAG Runs.can_create Admin
airflow-webserver_1  | [2022-11-22 14:51:07,558] {manager.py:512} WARNING - Refused to delete permission view, assoc with role exists DAG Runs.can_create Admin
airflow-webserver_1  | [2022-11-22 14:51:07,961] {manager.py:512} WARNING - Refused to delete permission view, assoc with role exists DAG Runs.can_create Admin
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:51:13 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:51:19 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:51:29 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:51:39 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-triggerer_1  | [2022-11-22 14:51:43,090] {triggerer_job.py:251} INFO - 0 triggers currently running
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:51:49 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:51:59 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:52:09 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:52:19 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 111.18.55.147 - - [22/Nov/2022:14:52:21 +0000] "GET / HTTP/1.1" 302 217 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
airflow-webserver_1  | 111.18.55.147 - - [22/Nov/2022:14:52:21 +0000] "GET /home HTTP/1.1" 302 311 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:52:39 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2022-11-22 14:52:42,653] {manager.py:228} INFO - Updated user Airflow Admin
airflow-webserver_1  | 111.18.55.147 - - [22/Nov/2022:14:52:42 +0000] "POST /login/?next=http%3A%2F%2F121.4.223.80%3A8080%2Fhome HTTP/1.1" 302 265 "http://121.4.223.80:8080/login/?next=http%3A%2F%2F121.4.223.80%3A8080%2Fhome" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
airflow-webserver_1  | /home/airflow/.local/lib/python3.7/site-packages/airflow/www/utils.py:110 DeprecationWarning: 'jinja2.Markup' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.Markup' instead.
airflow-webserver_1  | /home/airflow/.local/lib/python3.7/site-packages/airflow/www/utils.py:204 DeprecationWarning: 'jinja2.Markup' is deprecated and will be removed in Jinja 3.1. Import 'markupsafe.Markup' instead.
airflow-triggerer_1  | [2022-11-22 14:52:43,157] {triggerer_job.py:251} INFO - 0 triggers currently running
airflow-webserver_1  | 111.18.55.147 - - [22/Nov/2022:14:52:43 +0000] "GET /static/dist/switch.1722a49cd9c0ba00856d.css HTTP/1.1" 200 0 "http://121.4.223.80:8080/home" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
airflow-webserver_1  | 111.18.55.147 - - [22/Nov/2022:14:52:43 +0000] "GET /static/dist/dags.41c756828992e95c9478.css HTTP/1.1" 200 0 "http://121.4.223.80:8080/home" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
airflow-webserver_1  | 111.18.55.147 - - [22/Nov/2022:14:52:46 +0000] "POST /task_stats HTTP/1.1" 200 38507 "http://121.4.223.80:8080/home" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.0.0 Safari/537.36"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:52:50 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [22/Nov/2022:14:53:00 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
^CGracefully stopping... (press Ctrl+C again to force)
Stopping airflow_airflow-webserver_1 ... done
Stopping airflow_airflow-triggerer_1 ... done
Stopping airflow_airflow-scheduler_1 ... done
Stopping airflow_flower_1            ... done
Stopping airflow_airflow-worker_1    ... done
Stopping airflow_postgres_1          ... done
Stopping airflow_redis_1             ... done
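Running docker-compose up in the foreground ties every service to the terminal, which is why the single Ctrl+C above shuts the whole stack down. For normal use the stack is usually started detached; a minimal sketch with the same compose file:

# Start all services in the background
docker-compose -f docker-compose.yaml up -d

# Follow the logs of one service, e.g. the webserver
docker-compose -f docker-compose.yaml logs -f airflow-webserver

# Stop and remove the containers (add -v to also drop the postgres volume)
docker-compose -f docker-compose.yaml down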

Open the Airflow web UI in a browser: localhost:8080 (in the access log above the same UI is reached on the host's public IP, 121.4.223.80:8080).

Username: airflow, password: airflow (the default Admin account created by airflow-init, as shown in the log above).
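The webserver can also be checked from the command line. The /health endpoint is the one producing the recurring GET /health 200 lines in the log above; the REST API call assumes the basic_auth API backend that the official compose file enables:

# Health check: returns a small JSON body with metadatabase and scheduler status
curl -s http://localhost:8080/health

# Stable REST API with the default credentials (assumes basic_auth is enabled, as in the official compose file)
curl -s -u airflow:airflow http://localhost:8080/api/v1/dags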

 

docker ps -a 
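docker ps -a lists every container on the host. To scope the view to just this compose project, with the current state of each Airflow service, the compose-level listing may be more convenient:

# Only the containers of this compose project, including their current state
docker-compose -f docker-compose.yaml ps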

 

 

 

Copyright notice: this is an original post by the author, released under the CC 4.0 BY-SA license. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/DataIntel_XiAn/article/details/127991755
