
big-whale's Introduction

Big Whale

Big Whale is a distributed computing task scheduling platform developed by the Meiyou (美柚) big data team. It provides DAG scheduling for Spark, Flink, and other batch jobs, plus run management and status monitoring for streaming jobs, and it also offers Yarn application management, duplicate-application detection, and large-memory-application detection. The service is built on Spring Boot 2.0 and can be run directly after packaging. [Github] [Gitee]

Overview

1. Architecture

[image]

2. Features

  • SSH-based script execution, so deployment is simple and fast: only a single service is required
  • Task status synchronization based on the Yarn REST API, with no version restrictions on Spark or Flink
  • Failure retry support
  • Task dependency support
  • Complex task orchestration (DAG) support
  • Run management and monitoring for streaming jobs
  • Yarn application management

Deployment

1. Prerequisites

  • Java 1.8+
  • MySQL 5.1.0+
  • Download the project or git clone it
  • To fix images in the GitHub README.md failing to load, add the relevant domain-resolution rules to your hosts file; see: hosts

2. Installation

  • Create the database: big-whale
  • Run the database script: big-whale.sql
  • Configure the database account and password, as well as the SMTP settings, for the corresponding Spring Boot environment
  • Configure: big-whale.properties (a sketch of the file follows this list)
    • Configuration items
      • ssh.user: SSH remote login user with permission to execute scripts (the platform uses this account as the unified script execution user)
      • ssh.password: password of the SSH remote login user
      • dingding.enabled: whether DingTalk alerting is enabled
      • dingding.watcher-token: token of the DingTalk group chat robot
      • yarn.app-memory-threshold: memory limit for Yarn applications (in MB); -1 disables the check
      • yarn.app-white-list: Yarn application whitelist (applications on this list do not trigger an alert even if the memory they request exceeds the limit)
  • Modify: $FLINK_HOME/bin/flink, see: flink (because flink can only read local jar packages when submitting a job, the jar must be downloaded from HDFS at submit time and the jar path argument in the command replaced; a sketch of the idea follows this list)
  • Package: mvn clean package
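
A minimal big-whale.properties sketch covering the items above; the key names come from the list above, while every value (user, password, token, threshold, whitelist entry) is a placeholder and the whitelist format is an assumption:

# SSH account the platform uses to run scripts on the agents (placeholder values)
ssh.user=bigdata
ssh.password=changeit

# DingTalk alerting (optional)
dingding.enabled=false
dingding.watcher-token=

# Alert when a Yarn application requests more than 16384 MB; -1 disables the check
yarn.app-memory-threshold=16384
# Applications exempt from the memory check (comma-separated names assumed)
yarn.app-white-list=critical-streaming-app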
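
The repository ships a reference flink script for this modification; the sketch below is not that script, only an illustration of the idea, assuming the jar argument starts with hdfs:// and that the original launcher has been kept alongside it as flink.orig (a hypothetical name):

#!/usr/bin/env bash
# Sketch: download an hdfs:// jar to a local path and substitute it before
# delegating to the real Flink launcher.
ARGS=()
for arg in "$@"; do
  if [[ "$arg" == hdfs://* ]]; then
    local_jar="/tmp/$(basename "$arg")"
    hdfs dfs -get -f "$arg" "$local_jar"   # fetch the jar from HDFS
    ARGS+=("$local_jar")                   # replace the jar path argument
  else
    ARGS+=("$arg")
  fi
done
exec "$FLINK_HOME/bin/flink.orig" "${ARGS[@]}"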

3. Startup

  • Check whether port 17070 is already in use; if it is, stop the process occupying it or change the port in the project configuration and repackage
  • Copy big-whale.jar from the target directory and run: java -jar big-whale.jar (see the sketch below)
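
A minimal sketch of these two steps on a Linux host, assuming lsof is available and using an arbitrary installation directory and log file name:

lsof -i :17070                                         # check whether the port is already taken
cp target/big-whale.jar /opt/big-whale/
cd /opt/big-whale
nohup java -jar big-whale.jar > big-whale.log 2>&1 &   # keep the service running in the background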

4. Initial configuration

  • Open: http://localhost:17070
    [image]
  • Log in with account admin and password admin
  • Go to 权限管理->用户管理 (Permission Management -> User Management) and change the e-mail address of the current account to a valid, existing address, otherwise e-mail delivery will fail
  • Add a cluster
    • 集群管理->集群管理->新增 (Cluster Management -> Cluster Management -> Add)
      [image]
    • "yarn管理地址" (Yarn management address) is the web UI address of the Yarn ResourceManager
    • "程序包存储目录" (package storage directory) is the HDFS path under which uploaded program packages are stored, e.g. /data/big-whale/storage
    • "支持Flink任务代理用户" (supported Flink proxy users), "流处理任务黑名单" (streaming task blacklist) and "批处理任务黑名单" (batch task blacklist) implement internally customized task assignment rules; leave them empty
  • Add cluster users
    • 集群管理->集群用户->新增 (Cluster Management -> Cluster Users -> Add)
      [image]
    • This configuration defines which Yarn resource queue (--queue) and proxy user (--proxy-user) a platform user may use on the selected cluster (see the submit-command sketch after this list)
  • Add agents
    • 集群管理->代理管理->新增 (Cluster Management -> Agent Management -> Add)
      [image]
    • Multiple instances can be added (IP addresses only; a port may be specified, default 22). When a script is executed, one instance is chosen at random; if that instance is unreachable, another one is chosen at random, and execution fails only when no instance is reachable
    • If a cluster is selected, the agent becomes one of the agents used to submit Spark or Flink jobs for that cluster
  • Add compute framework versions
    • 集群管理->版本管理->新增 (Cluster Management -> Version Management -> Add)
      [image]
    • The submit command may differ between versions of Spark or Flink on the same cluster; for example, Spark 1.6.0 is submitted with spark-submit while Spark 2.1.0 is submitted with spark2-submit
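
For illustration only: the queue and proxy user granted to a cluster user correspond to standard submit options, so a command issued on behalf of such a user could look like the following sketch (queue name, proxy user, class and jar are placeholders, not the platform's exact generated command):

# --queue and --proxy-user come from the cluster-user configuration
spark-submit --master yarn --deploy-mode cluster \
  --queue root.users.data \
  --proxy-user data_user \
  --class com.example.Main app.jar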

Usage

1. Offline scheduling

1.1 Create

  • Three types of batch task are currently supported: "Shell", "Spark Batch" and "Flink Batch"
  • Drag the corresponding batch task icon from the toolbar on the left to add a DAG node
    [image]
    [image]
    • Time parameters such as ${now}, ${now - 1d} and ${now - 1h@yyyyMMddHHmmss} are supported (d = days, h = hours, m = minutes, s = seconds; @yyyyMMddHHmmss is the format pattern; see the example after this list)
    • For batch tasks other than "Shell", upload the program package matching the task type; here it is the jar built for a Spark batch job
    • "资源选项" (resource options) may be left empty
    • The code has two editing modes, "可视化视图" (visual view) and "代码视图" (code view), which can be switched back and forth
    • Click "测试" (Test) to check that the current node is configured correctly and can run
    • To keep platform threads from being tied up, the platform always submits Spark or Flink jobs in "background" mode, i.e. with the Spark setting --conf spark.yarn.submit.waitAppCompletion=false or the Flink flag -d (see the command sketch after this list). Thanks to the callbacks of the background "job status update task", the DAG execution engine still guarantees that the job submitted by the current node has finished before the next node's task is executed
  • DAG nodes support failure retry
  • Connecting the nodes in the desired order builds a complete DAG
    [image]
  • Once the DAG is built, click "保存" (Save) to finish the schedule configuration
    [image]
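
A worked example of the time parameters, assuming they are substituted into the script before it runs and that the default format is yyyyMMdd (the default format is an assumption; the @ suffix overrides it):

# If the scheduled fire time is 2021-03-05 15:18:23:
echo "${now}"                       # -> 20210305
echo "${now - 1d}"                  # -> 20210304
echo "${now - 1h@yyyyMMddHHmmss}"   # -> 20210305141823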
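
For reference, the "background" submission described above corresponds to the following standard Spark and Flink options (class names, jar names and the remaining options are placeholders, not the platform's exact generated commands):

# Spark on Yarn: return as soon as the application is accepted
spark-submit --master yarn --deploy-mode cluster \
  --conf spark.yarn.submit.waitAppCompletion=false \
  --class com.example.BatchJob batch-job.jar

# Flink on Yarn: -d submits the job in detached mode
flink run -m yarn-cluster -d flink-job.jar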

1.2 Operations

  • Open the offline scheduling list [image]
  • Click "调度实例" (schedule instances) in the action column on the left to view the list of schedule instances, their run status and the node start-up logs
    [image]
  • Click "手动执行" (run manually) in the action column on the left to trigger a run of the schedule

2. Real-time tasks

2.1 Create

  • Two types of streaming task are currently supported: "Spark Stream" and "Flink Stream"
    [image]
  • Enabling monitoring lets the platform watch the task's status, including restarting it on failure and alerting on batch backlog
    [image]

2.2 Operations

  • Open the real-time task list [image]
  • Click "日志" (logs) in the action column on the left to view the task start-up log
  • Click "执行" (run) in the action column on the left to start the task

3. Task alerts

  • With e-mail or DingTalk alerting configured correctly, an alert e-mail or notification is sent whenever a task runs into trouble, so it can be handled promptly. Examples:

<Big Whale task alert>
Agent: agent1
Type: script execution failed
User: admin
Task: schedule example 1 - shell_test
Time: 2021-03-05 15:18:23

<Big Whale task alert>
Cluster: cluster1
Type: Spark offline task error (FAILED)
User: admin
Task: schedule example 1 - spark_test
Time: 2021-03-05 15:28:33

<Big Whale task alert>
Cluster: cluster1
Type: Spark real-time task batch backlog, restarted
User: admin
Task: sparkstream_test
Time: 2021-03-05 15:30:41

  • Besides the alerts above there are other alert types that are not listed here one by one

Change log

  • v1.1: DAG support added
  • v1.2: failure retry for DAG nodes added
  • v1.3: the scheduling engine was rewritten; upgrading from older versions is not supported, so tasks from older versions must be migrated manually; "Python" script support was removed from offline scheduling

License

The project is licensed under the Apache 2 license.

big-whale's Issues

Queue cannot be selected

Hi!
When using offline scheduling, I sometimes cannot select a queue that has already been configured; has anyone else run into this?

can not retry select for update statement

A job reports this error right at startup; how can I tell where the problem is?

2021-12-17 18:18:33.305 ERROR (QuartzScheduler.java:2401)- An error occurred while firing triggers '[Trigger 'DEFAULT.6da64b5bd2ee-babeea54-29a6-4c27-9a4e-9248d97a18e0': triggerClass: 'org.quartz.impl.triggers.CronTriggerImpl calendar: 'null' misfireInstruction: 2 nextFireTime: Fri Dec 17 18:16:50 CST 2021]'
org.quartz.JobPersistenceException: Couldn't commit jdbc connection. [1008803] can not retry select for update statement
at org.quartz.impl.jdbcjobstore.JobStoreSupport.commitConnection(JobStoreSupport.java:3734)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.executeInNonManagedTXLock(JobStoreSupport.java:3851)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.triggersFired(JobStoreSupport.java:2962)
at org.quartz.core.QuartzSchedulerThread.run(QuartzSchedulerThread.java:353)
Caused by: java.sql.SQLException: [1008803] can not retry select for update statement
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:963)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3966)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3902)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2526)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2673)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2545)
at com.mysql.jdbc.ConnectionImpl.commit(ConnectionImpl.java:1614)
at com.zaxxer.hikari.pool.ProxyConnection.commit(ProxyConnection.java:368)
at com.zaxxer.hikari.pool.HikariProxyConnection.commit(HikariProxyConnection.java)
at sun.reflect.GeneratedMethodAccessor132.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.quartz.impl.jdbcjobstore.AttributeRestoringConnectionInvocationHandler.invoke(AttributeRestoringConnectionInvocationHandler.java:73)
at com.sun.proxy.$Proxy116.commit(Unknown Source)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.commitConnection(JobStoreSupport.java:3732)
... 3 common frames omitted

What are the input/output parameters for?

In script management, what is the purpose of the input/output parameters of a Flink batch job?

I cannot see them having any effect either in the code view or in the backend code.

For example, input/output parameters like this:
flink run -m yarn-cluster -yjm 1024 -ytm 1024 -ys 1 -yqu root.users.root -ynm WordCountDemos -d -yD ypu=root /tmp/hdfs_pkg/2020101520/5509WordCount.jar --allocate.balancer.type=available \
  -input hdfs://10.3.87.23:8020/data/flink/GoneWiththeWind.txt \
  -output hdfs://10.3.87.23:8020/data/flink/wordcount-result-1.txt

Job start error with Flink 1.11.2

I followed the documentation to set up the big-whale platform.
Then, when I created a Flink (1.11.2) job and started it, I got the following errors:

org.apache.flink.streaming.runtime.tasks.StreamTaskException: Could not instantiate outputs in order.
at org.apache.flink.streaming.api.graph.StreamConfig.getOutEdgesInOrder(StreamConfig.java:429) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.createRecordWriters(StreamTask.java:1147) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.createRecordWriterDelegate(StreamTask.java:1131) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:284) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:271) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:251) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:244) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:234) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.<init>(OneInputStreamTask.java:62) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_152]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_152]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_152]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_152]
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:1372) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:699) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:546) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_152]
Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.keySelector of type org.apache.flink.api.java.functions.KeySelector in instance of org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233) ~[?:1.8.0_152]
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405) ~[?:1.8.0_152]
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2288) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568) ~[?:1.8.0_152]
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2282) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428) ~[?:1.8.0_152]
at java.util.ArrayList.readObject(ArrayList.java:797) ~[?:1.8.0_152]
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_152]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_152]
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2173) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428) ~[?:1.8.0_152]
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:576) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:562) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:550) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:511) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.streaming.api.graph.StreamConfig.getOutEdgesInOrder(StreamConfig.java:426) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
... 16 more

I dug into it and found that it is caused by the local jar file name format:
when I changed the file name format from
"$path/${runFile}.`date '+%M%S'`" to
"$path/`date '+%M%S'`-${runFile}",
it solved the issue.

In addition, I made a PR for it: #32; I hope it helps.

Imported the database and compiled the source successfully, but startup fails: Unable to build Hibernate SessionFactory; nested exception is org.hibernate.MappingException: Could not get constructor for org.hibernate.persister.entity.SingleTableEntityPersister

:: Spring Boot :: (v2.0.5.RELEASE)

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.springframework.cglib.core.ReflectUtils$1 (file:/E:/.m2/repository/org/springframework/spring-core/5.0.9.RELEASE/spring-core-5.0.9.RELEASE.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of org.springframework.cglib.core.ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2020-09-09 11:57:17.174 WARN 10980 --- [ main] org.hibernate.id.UUIDHexGenerator : HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2020-09-09 11:57:17.934 WARN 10980 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Invocation of init method failed; nested exception is javax.persistence.PersistenceException: [PersistenceUnit: default] Unable to build Hibernate SessionFactory; nested exception is org.hibernate.MappingException: Could not get constructor for org.hibernate.persister.entity.SingleTableEntityPersister
2020-09-09 11:57:18.029 ERROR 10980 --- [ main] o.s.boot.SpringApplication : Application run failed

org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'entityManagerFactory' defined in class path resource [org/springframework/boot/autoconfigure/orm/jpa/HibernateJpaConfiguration.class]: Invocation of init method failed; nested exception is javax.persistence.PersistenceException: [PersistenceUnit: default] Unable to build Hibernate SessionFactory; nested exception is org.hibernate.MappingException: Could not get constructor for org.hibernate.persister.entity.SingleTableEntityPersister
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1699) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:573) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:495) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1089) ~[spring-context-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:859) ~[spring-context-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) ~[spring-boot-2.0.5.RELEASE.jar:2.0.5.RELEASE]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:780) ~[spring-boot-2.0.5.RELEASE.jar:2.0.5.RELEASE]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:412) ~[spring-boot-2.0.5.RELEASE.jar:2.0.5.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:333) ~[spring-boot-2.0.5.RELEASE.jar:2.0.5.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1277) ~[spring-boot-2.0.5.RELEASE.jar:2.0.5.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1265) ~[spring-boot-2.0.5.RELEASE.jar:2.0.5.RELEASE]
at com.meiyouframework.bigwhale.BigWhaleApplication.main(BigWhaleApplication.java:20) ~[classes/:na]
Caused by: javax.persistence.PersistenceException: [PersistenceUnit: default] Unable to build Hibernate SessionFactory; nested exception is org.hibernate.MappingException: Could not get constructor for org.hibernate.persister.entity.SingleTableEntityPersister
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:402) ~[spring-orm-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:377) ~[spring-orm-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.afterPropertiesSet(LocalContainerEntityManagerFactoryBean.java:341) ~[spring-orm-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1758) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1695) ~[spring-beans-5.0.9.RELEASE.jar:5.0.9.RELEASE]
... 16 common frames omitted
Caused by: org.hibernate.MappingException: Could not get constructor for org.hibernate.persister.entity.SingleTableEntityPersister
at org.hibernate.persister.internal.PersisterFactoryImpl.createEntityPersister(PersisterFactoryImpl.java:123) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.persister.internal.PersisterFactoryImpl.createEntityPersister(PersisterFactoryImpl.java:77) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.metamodel.internal.MetamodelImpl.initialize(MetamodelImpl.java:129) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:300) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.boot.internal.SessionFactoryBuilderImpl.build(SessionFactoryBuilderImpl.java:462) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:892) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:57) ~[spring-orm-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365) ~[spring-orm-5.0.9.RELEASE.jar:5.0.9.RELEASE]
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:390) ~[spring-orm-5.0.9.RELEASE.jar:5.0.9.RELEASE]
... 20 common frames omitted
Caused by: org.hibernate.HibernateException: Unable to instantiate default tuplizer [org.hibernate.tuple.entity.PojoEntityTuplizer]
at org.hibernate.tuple.entity.EntityTuplizerFactory.constructTuplizer(EntityTuplizerFactory.java:91) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.tuple.entity.EntityTuplizerFactory.constructDefaultTuplizer(EntityTuplizerFactory.java:116) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.tuple.entity.EntityMetamodel.<init>(EntityMetamodel.java:382) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.persister.entity.AbstractEntityPersister.<init>(AbstractEntityPersister.java:519) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.persister.entity.SingleTableEntityPersister.<init>(SingleTableEntityPersister.java:124) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:na]
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[na:na]
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[na:na]
at org.hibernate.persister.internal.PersisterFactoryImpl.createEntityPersister(PersisterFactoryImpl.java:96) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
... 28 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:na]
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[na:na]
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[na:na]
at org.hibernate.tuple.entity.EntityTuplizerFactory.constructTuplizer(EntityTuplizerFactory.java:88) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
... 38 common frames omitted
Caused by: java.lang.NullPointerException: null
at javassist.util.proxy.SecurityActions.setAccessible(SecurityActions.java:103) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.DefineClassHelper.toClass3(DefineClassHelper.java:151) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.DefineClassHelper.toClass2(DefineClassHelper.java:134) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.DefineClassHelper.toClass(DefineClassHelper.java:95) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.FactoryHelper.toClass(FactoryHelper.java:131) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.ProxyFactory.createClass3(ProxyFactory.java:530) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.ProxyFactory.createClass2(ProxyFactory.java:515) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.ProxyFactory.createClass1(ProxyFactory.java:451) ~[javassist-3.22.0-GA.jar:na]
at javassist.util.proxy.ProxyFactory.createClass(ProxyFactory.java:422) ~[javassist-3.22.0-GA.jar:na]
at org.hibernate.proxy.pojo.javassist.JavassistProxyFactory.postInstantiate(JavassistProxyFactory.java:75) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.tuple.entity.PojoEntityTuplizer.buildProxyFactory(PojoEntityTuplizer.java:162) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.tuple.entity.AbstractEntityTuplizer.<init>(AbstractEntityTuplizer.java:156) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
at org.hibernate.tuple.entity.PojoEntityTuplizer.<init>(PojoEntityTuplizer.java:58) ~[hibernate-core-5.2.17.Final.jar:5.2.17.Final]
... 44 common frames omitted

Process finished with exit code 1

Support viewing task execution history

At present, creating a task saves a record into the scheduling table, and on every schedule run the original record in the scheduling table is updated in place. Doesn't this design make it impossible to query a task's execution history?
