[TSAApp@aml-qa4-core1 bidmgr]$ ./admax.sh -d5 -a2502 -T2019-01-07 -z America/New_York
Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
------- Loaded DatabaseAccounts properties from file:/usr/local/tsa/bidmgr-distribution-2.9.25-BETA16-dev-t20190108-1248/libs/deployment-bidmgr-shaded-jar-2.9.25-BETA16.jar!/core-jdb-datasource.properties -------
{datasource-user=spike, datasource-password=tar63t}
2019-01-09 13:52:09.134 (3) [main]: Debug Level set to 3
2019-01-09 13:52:09.135 (3) [main]: Licensing environment detected, loading configuration.
2019-01-09 13:52:09.136 (3) [main]: Loaded property: distribution-service-url=http://aml-qa4-front1:8080/distribution
2019-01-09 13:52:09.136 (3) [main]: Loaded property: distribution-service-username=readwrite
2019-01-09 13:52:09.136 (3) [main]: Loaded property: distribution-service-password=readwrite
2019-01-09 13:52:09.136 (3) [main]: Loaded property: sapi-auth-url=http://aml-qa4-front1:8080/sapi/rest/security/tokens
2019-01-09 13:52:09.136 (3) [main]: Loaded property: sapi-merchant-service-url=http://aml-qa4-front1:8080/sapi/rest/merchants
2019-01-09 13:52:09.136 (3) [main]: Loaded property: sapi-reseller-service-url=http://aml-qa4-front1:8080/sapi/rest/resellers
2019-01-09 13:52:09.136 (3) [main]: Loaded property: sapi-username=admin+tsa@thesearchagency.com
2019-01-09 13:52:09.137 (3) [main]: Loaded property: sapi-password=admin
2019-01-09 13:52:09.137 (3) [main]: Loaded property: sapi-realm=thesearchagency
2019-01-09 13:52:09.137 (3) [main]: Loaded property: tsa.google.mcc.developerToken=ojCnjZaM6RD1h0yt_DOJZg
2019-01-09 13:52:09.137 (3) [main]: Loaded property: tsa.google.mcc.isReseller=true
2019-01-09 13:52:09.137 (3) [main]: Loaded property: tsa.google.mcc.userAgent=AdMaxLocal AdWords Software TSA QA
2019-01-09 13:52:09.137 (3) [main]: Loaded property: tsa.timezone=UTC
2019-01-09 13:52:09.138 (3) [main]: Debug Level set to 5
2019-01-09 13:52:09.138 (3) [main]: AdMax Summarizer parameters: -d5 -a2502 -T2019-01-07 -z America/New_York
2019-01-09 13:52:09.223 (4) [main]: registered MBean [thesearchagency.db.util:type=MultiplexDatabasePool4142bc58-9af2-401d-84f4-a767332157f2] for class com.thesearchagency.db.util.MultiplexDatabasePool
2019-01-09 13:52:09.231 (5) [main]: ProfilerRec.start("AdMaxSummarizer",false) in Thread [main]
2019-01-09 13:52:09.234 (4) [main]: no account, or instance for databaseInstance, skipping lookup...
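
Editor's note: the invocation above, echoed back by the tool itself ("AdMax Summarizer parameters: -d5 -a2502 -T2019-01-07 -z America/New_York"), shows the flag contract in use here: -d sets the debug level (5), -a selects the account (2502), -T the processing date, and -z the account time zone. The sketch below only illustrates how such flags could be parsed; the class and method names are assumptions, not the actual AdMaxSummarizer code.

// Hypothetical sketch of the flag style seen in the log above
// (-d<level> -a<accountID> -T<yyyy-MM-dd> -z <timezone>).
import java.time.LocalDate;
import java.util.TimeZone;

public class SummarizerArgs {
    int debugLevel = 3;                                   // log shows a default of 3 before -d5 takes effect
    int accountId;
    LocalDate date;
    TimeZone zone = TimeZone.getTimeZone("UTC");          // tsa.timezone=UTC in the loaded properties

    static SummarizerArgs parse(String[] args) {
        SummarizerArgs a = new SummarizerArgs();
        for (int i = 0; i < args.length; i++) {
            String arg = args[i];
            if (arg.startsWith("-d")) {
                a.debugLevel = Integer.parseInt(arg.substring(2));
            } else if (arg.startsWith("-a")) {
                a.accountId = Integer.parseInt(arg.substring(2));
            } else if (arg.startsWith("-T")) {
                a.date = LocalDate.parse(arg.substring(2));
            } else if (arg.equals("-z")) {
                a.zone = TimeZone.getTimeZone(args[++i]); // value follows as a separate token
            }
        }
        return a;
    }

    public static void main(String[] args) {
        SummarizerArgs p = parse(new String[] {"-d5", "-a2502", "-T2019-01-07", "-z", "America/New_York"});
        System.out.println(p.debugLevel + " " + p.accountId + " " + p.date + " " + p.zone.getID());
    }
}
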
2019-01-09 13:52:09.463 (3) [main]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.464 (4) [main]: opened [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.466 (5) [main]: ProfilerRec.start("db",true) in Thread [main] 2019-01-09 13:52:09.466 (4) [main]: spike: SQL->[select * from `tsacommon`.`databaseInstances` where ((`name`="global-logging") and (`type`="mysql")) limit 1] 2019-01-09 13:52:09.469 (5) [main]: recordAsSubs("db", 3, "main") 2019-01-09 13:52:09.470 (5) [main]: addSub("db") 2019-01-09 13:52:09.470 (5) [main]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:09.470 (4) [main]: time=0.003 2019-01-09 13:52:09.471 (4) [main]: Query executed in 0.003s 2019-01-09 13:52:09.472 (5) [main]: loading 8 columns from result set: 2019-01-09 13:52:09.472 (5) [main]: loaded column: databaseInstances.id->4 2019-01-09 13:52:09.472 (5) [main]: loaded column: databaseInstances.name->global-logging 2019-01-09 13:52:09.472 (5) [main]: loaded column: databaseInstances.description->Global Logging Tables 2019-01-09 13:52:09.472 (5) [main]: loaded column: databaseInstances.type->mysql 2019-01-09 13:52:09.472 (5) [main]: loaded column: databaseInstances.database->logging 2019-01-09 13:52:09.473 (5) [main]: loaded column: databaseInstances.host->logging-01-write 2019-01-09 13:52:09.473 (5) [main]: loaded column: databaseInstances.port->3306 2019-01-09 13:52:09.473 (5) [main]: loaded column: databaseInstances.access->rw 2019-01-09 13:52:09.473 (4) [main]: Record Loaded 2019-01-09 13:52:09.473 (4) [main]: DataCache: put key global-logging 2019-01-09 13:52:09.474 (4) [main]: closed [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.476 (4) [main]: registered MBean [com.carldunham.jst.pooling:type=ObjectPoolf9cfda3a-d2ff-47f9-834d-a82d79b5327b] for class com.carldunham.jst.pooling.ObjectPool 2019-01-09 13:52:09.477 (3) [main]: DatabasePool with a limit of 18 created 2019-01-09 13:52:09.477 (4) [main]: registered MBean [com.carldunham.jst.db:type=DatabasePoola5ef9f8a-d25f-4e88-9734-75f89f12fb5c] for class com.carldunham.jst.db.DatabasePool 2019-01-09 13:52:09.480 (4) [ConnectionCloser]: Starting DatabasePool ConnectionCloserThread 2019-01-09 13:52:09.491 (3) [main]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.491 (4) [main]: opened [jdbc:mysql://logging-01-write:3306/logging?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.491 (4) [main]: Created new Object 2019-01-09 13:52:09.491 (5) [main]: ProfilerRec.start("db",true) in Thread [main] 2019-01-09 13:52:09.491 (4) [main]: spike: SQL->[select @@version] 2019-01-09 13:52:09.494 (5) [main]: recordAsSubs("db", 3, "main") 2019-01-09 13:52:09.494 (5) [main]: addSub("db") 2019-01-09 13:52:09.494 (5) [main]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:09.494 (4) [main]: time=0.003 2019-01-09 13:52:09.494 (4) [main]: Query executed in 0.003s 2019-01-09 13:52:09.494 (4) [main]: test query succeeded, returned "5.5.34" 2019-01-09 13:52:09.495 (4) [main]: preparing SQL statement [insert into logging.processHistory (processID, 
startTime) values (?,?)], genflags=yes 2019-01-09 13:52:09.508 (3) [main]: Could not save process history start 2019-01-09 13:52:09.508 (4) [main]: Returning Object to Pool 2019-01-09 13:52:09.508 (3) [main]: =============== Starting AdMax Summarizer ================ 2019-01-09 13:52:09.509 (4) [main]: no account, or instance for databaseInstance, skipping lookup... 2019-01-09 13:52:09.509 (3) [main]: ----------------AdMax Summarizer starting 2019-01-09 13:52:09.523 (3) [main]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.523 (4) [main]: opened [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.523 (5) [main]: ProfilerRec.start("db",true) in Thread [main] 2019-01-09 13:52:09.523 (4) [main]: spike: SQL->[select * from `tsacommon`.`accounts` where ((`tsacommon`.`accounts`.`id` in (2502)) and (`tsacommon`.`accounts`.`timeZoneID` in ("America/New_York"))) order by rand()] 2019-01-09 13:52:09.525 (5) [main]: recordAsSubs("db", 2, "main") 2019-01-09 13:52:09.525 (5) [main]: addSub("db") 2019-01-09 13:52:09.525 (5) [main]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:09.525 (4) [main]: time=0.002 2019-01-09 13:52:09.525 (4) [main]: Query executed in 0.002s 2019-01-09 13:52:09.525 (5) [main]: loading 21 columns from result set: 2019-01-09 13:52:09.526 (5) [main]: loaded column: accounts.id->2502 2019-01-09 13:52:09.526 (5) [main]: loaded column: accounts.description->eb11f94b-6db1-4004-997a-53a901b02366 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.tier->1 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.priority->50 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.reportClicksOut->false 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.reportConversions->false 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.conversionTypes->paid 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.isBidManaged->true 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.isActive->true 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.perfMetric->cpa 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.providesConversionData->true 2019-01-09 13:52:09.527 (5) [main]: loaded column: accounts.addConversionData->true 2019-01-09 13:52:09.528 (5) [main]: loaded column: accounts.fractionalConversionCounts->false 2019-01-09 13:52:09.528 (5) [main]: loaded column: accounts.hideNullSources->false 2019-01-09 13:52:09.528 (5) [main]: loaded column: accounts.nowTime->null 2019-01-09 13:52:09.530 (5) [main]: skipping invalid column: accounts.warehouseDatabase 2019-01-09 13:52:09.531 (5) [main]: loaded column: accounts.databaseInstance->1 2019-01-09 13:52:09.531 (5) [main]: loaded column: accounts.warehouseInstance->1 2019-01-09 13:52:09.531 (5) [main]: loaded column: accounts.isOnline->true 2019-01-09 13:52:09.531 (5) [main]: loaded column: accounts.currencyCode->USD 2019-01-09 13:52:09.531 (5) [main]: loaded column: accounts.timeZoneID->America/New_York 2019-01-09 13:52:09.537 (3) [P1T1]: Starting Account ID #2502: eb11f94b-6db1-4004-997a-53a901b02366 2019-01-09 13:52:09.537 (4) [P1T1]: no account, or instance for databaseInstance, skipping lookup... 
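
Editor's note: the run prepares "insert into logging.processHistory (processID, startTime) values (?,?)" with genflags=yes (generated keys), logs "Could not save process history start" at level 3 with no stack trace, and continues to the "Starting AdMax Summarizer" banner, so the failure is treated as non-fatal. A minimal JDBC sketch of that insert, assuming genflags=yes means RETURN_GENERATED_KEYS; only the SQL text comes from the log, the wrapper is hypothetical.

import java.sql.*;

public class ProcessHistorySketch {
    // Hypothetical sketch of the processHistory "start" insert seen in the log.
    static long saveProcessHistoryStart(Connection conn, int processId, Timestamp startTime) throws SQLException {
        String sql = "insert into logging.processHistory (processID, startTime) values (?,?)";
        try (PreparedStatement ps = conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS)) {
            ps.setInt(1, processId);
            ps.setTimestamp(2, startTime);
            ps.executeUpdate();
            try (ResultSet keys = ps.getGeneratedKeys()) {
                return keys.next() ? keys.getLong(1) : -1L;   // generated processHistory row id
            }
        }
    }
}
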
2019-01-09 13:52:09.550 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.550 (4) [P1T1]: opened [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.550 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:09.550 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.550 (4) [P1T1]: spike: SQL->[select * from `tsacommon`.`accounts` where (`id`=2502) limit 1] 2019-01-09 13:52:09.556 (5) [P1T1]: recordAsSubs("db", 6, "P1T1") 2019-01-09 13:52:09.556 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.556 (5) [P1T1]: ProfilerRec.end("db", true): ret=6 2019-01-09 13:52:09.557 (4) [P1T1]: time=0.006 2019-01-09 13:52:09.557 (4) [P1T1]: Query executed in 0.006s 2019-01-09 13:52:09.557 (5) [P1T1]: loading 21 columns from result set: 2019-01-09 13:52:09.557 (5) [P1T1]: loaded column: accounts.id->2502 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.description->eb11f94b-6db1-4004-997a-53a901b02366 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.tier->1 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.priority->50 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.reportClicksOut->false 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.reportConversions->false 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.conversionTypes->paid 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.isBidManaged->true 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.isActive->true 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.perfMetric->cpa 2019-01-09 13:52:09.558 (5) [P1T1]: loaded column: accounts.providesConversionData->true 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.addConversionData->true 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.fractionalConversionCounts->false 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.hideNullSources->false 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.nowTime->null 2019-01-09 13:52:09.559 (5) [P1T1]: skipping invalid column: accounts.warehouseDatabase 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.databaseInstance->1 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.warehouseInstance->1 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.isOnline->true 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.currencyCode->USD 2019-01-09 13:52:09.559 (5) [P1T1]: loaded column: accounts.timeZoneID->America/New_York 2019-01-09 13:52:09.559 (4) [P1T1]: Record Loaded 2019-01-09 13:52:09.559 (4) [P1T1]: DataCache: put key acct-st-tracker 2502 2019-01-09 13:52:09.560 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.560 (4) [P1T1]: spike: SQL->[select * from `tsacommon`.`databaseInstances` where ((`name`="acct-st-tracker-1") and (`type`="mysql")) limit 1] 2019-01-09 13:52:09.563 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:09.563 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.563 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:09.563 (4) [P1T1]: time=0.003 2019-01-09 13:52:09.563 (4) [P1T1]: Query executed in 0.003s 2019-01-09 13:52:09.563 (5) [P1T1]: loading 8 columns from result set: 2019-01-09 13:52:09.563 (5) [P1T1]: loaded column: databaseInstances.id->15 2019-01-09 
13:52:09.563 (5) [P1T1]: loaded column: databaseInstances.name->acct-st-tracker-1 2019-01-09 13:52:09.563 (5) [P1T1]: loaded column: databaseInstances.description->Account-specific Catch-all database 2019-01-09 13:52:09.564 (5) [P1T1]: loaded column: databaseInstances.type->mysql 2019-01-09 13:52:09.564 (5) [P1T1]: loaded column: databaseInstances.database->st-tracker 2019-01-09 13:52:09.564 (5) [P1T1]: loaded column: databaseInstances.host->acctdb-01-write 2019-01-09 13:52:09.564 (5) [P1T1]: loaded column: databaseInstances.port->3306 2019-01-09 13:52:09.564 (5) [P1T1]: loaded column: databaseInstances.access->rw 2019-01-09 13:52:09.564 (4) [P1T1]: Record Loaded 2019-01-09 13:52:09.564 (4) [P1T1]: DataCache: put key acct-st-tracker-1 2019-01-09 13:52:09.564 (4) [P1T1]: closed [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.565 (4) [P1T1]: registered MBean [com.carldunham.jst.pooling:type=ObjectPoolb9d46c74-7c92-4986-9fed-b1856f789d97] for class com.carldunham.jst.pooling.ObjectPool 2019-01-09 13:52:09.565 (3) [P1T1]: DatabasePool with a limit of 18 created 2019-01-09 13:52:09.565 (4) [P1T1]: registered MBean [com.carldunham.jst.db:type=DatabasePool57dc7418-0dd1-48ca-a26f-d35b67f647a1] for class com.carldunham.jst.db.DatabasePool 2019-01-09 13:52:09.574 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.574 (4) [P1T1]: opened [jdbc:mysql://acctdb-01-write:3306/st-tracker?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.574 (4) [P1T1]: Created new Object 2019-01-09 13:52:09.574 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.574 (4) [P1T1]: spike: SQL->[select @@version] 2019-01-09 13:52:09.575 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:09.575 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.575 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:09.575 (4) [P1T1]: time=0.001 2019-01-09 13:52:09.575 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:09.575 (4) [P1T1]: test query succeeded, returned "5.5.34" 2019-01-09 13:52:09.575 (4) [P1T1]: no account, or instance for databaseInstance, skipping lookup... 
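
Editor's note: the pattern above repeats for every database the run touches: read the tsacommon.databaseInstances row (here acct-st-tracker-1 on acctdb-01-write:3306, database st-tracker), cache it under a DataCache key, build the MySQL JDBC URL from host/port/database plus a fixed set of connector options, create a pooled connection, and validate it with "select @@version" ("test query succeeded, returned 5.5.34"). A sketch of that URL construction and test query, assuming a hypothetical DatabaseInstance holder; the connector options are copied verbatim from the opened URLs in the log.

import java.sql.*;

public class InstanceConnectionSketch {
    // Hypothetical holder for one tsacommon.databaseInstances row.
    record DatabaseInstance(String host, int port, String database) {}

    // Connector options exactly as they appear in the JDBC URLs in the log.
    static final String OPTIONS = "useUnicode=true&zeroDateTimeBehavior=convertToNull"
            + "&dontTrackOpenResources=true&jdbcCompliantTruncation=false"
            + "&useServerPrepStmts=false&rewriteBatchedStatements=true";

    static Connection open(DatabaseInstance inst, String user, String password) throws SQLException {
        String url = "jdbc:mysql://" + inst.host() + ":" + inst.port() + "/" + inst.database() + "?" + OPTIONS;
        Connection conn = DriverManager.getConnection(url, user, password);
        // New pooled connections are validated with a test query, as seen in the log.
        try (Statement st = conn.createStatement(); ResultSet rs = st.executeQuery("select @@version")) {
            if (rs.next()) {
                System.out.println("test query succeeded, returned \"" + rs.getString(1) + "\"");
            }
        }
        return conn;
    }
}
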
2019-01-09 13:52:09.575 (4) [P1T1]: registered MBean [com.carldunham.jst.pooling:type=ObjectPool4d0bd70d-8c87-4549-bb92-be166060e612] for class com.carldunham.jst.pooling.ObjectPool 2019-01-09 13:52:09.576 (3) [P1T1]: DatabasePool with a limit of 18 created 2019-01-09 13:52:09.576 (4) [P1T1]: registered MBean [com.carldunham.jst.db:type=DatabasePoolcf3649b0-477d-4727-88f8-14665a57ebad] for class com.carldunham.jst.db.DatabasePool 2019-01-09 13:52:09.588 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.588 (4) [P1T1]: opened [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.588 (4) [P1T1]: Created new Object 2019-01-09 13:52:09.588 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.588 (4) [P1T1]: spike: SQL->[select @@version] 2019-01-09 13:52:09.590 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:09.590 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.590 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:09.590 (4) [P1T1]: time=0.002 2019-01-09 13:52:09.590 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:09.590 (4) [P1T1]: test query succeeded, returned "5.5.34" 2019-01-09 13:52:09.590 (4) [P1T1]: no account, or instance for databaseInstance, skipping lookup... 2019-01-09 13:52:09.598 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.598 (4) [P1T1]: opened [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.599 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.599 (4) [P1T1]: spike: SQL->[select * from `tsacommon`.`databaseInstances` where ((`name`="global-staging") and (`type`="mysql")) limit 1] 2019-01-09 13:52:09.601 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:09.601 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.601 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:09.601 (4) [P1T1]: time=0.001 2019-01-09 13:52:09.601 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:09.601 (5) [P1T1]: loading 8 columns from result set: 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.id->3 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.name->global-staging 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.description->Global Staging Tables 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.type->mysql 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.database->staging 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.host->staging-01-write 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.port->3306 2019-01-09 13:52:09.601 (5) [P1T1]: loaded column: databaseInstances.access->rw 2019-01-09 13:52:09.601 (4) [P1T1]: Record Loaded 2019-01-09 13:52:09.601 (4) [P1T1]: DataCache: put key global-staging 2019-01-09 13:52:09.602 (4) [P1T1]: closed [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.602 (4) [P1T1]: registered MBean 
[com.carldunham.jst.pooling:type=ObjectPoolceb4a3cf-32a9-424f-85ed-fca2a75105dc] for class com.carldunham.jst.pooling.ObjectPool 2019-01-09 13:52:09.602 (3) [P1T1]: DatabasePool with a limit of 18 created 2019-01-09 13:52:09.602 (4) [P1T1]: registered MBean [com.carldunham.jst.db:type=DatabasePool425013f7-d323-4f98-aab0-87f1c4b652fc] for class com.carldunham.jst.db.DatabasePool 2019-01-09 13:52:09.612 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.612 (4) [P1T1]: opened [jdbc:mysql://staging-01-write:3306/staging?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.612 (4) [P1T1]: Created new Object 2019-01-09 13:52:09.612 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.612 (4) [P1T1]: spike: SQL->[select @@version] 2019-01-09 13:52:09.613 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:09.613 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.613 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:09.613 (4) [P1T1]: time=0.001 2019-01-09 13:52:09.613 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:09.614 (4) [P1T1]: test query succeeded, returned "5.5.34" 2019-01-09 13:52:09.614 (4) [P1T1]: no account, or instance for databaseInstance, skipping lookup... 2019-01-09 13:52:09.622 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.622 (4) [P1T1]: opened [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.622 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.622 (4) [P1T1]: spike: SQL->[select * from `tsacommon`.`databaseInstances` where ((`name`="global-clients") and (`type`="mysql")) limit 1] 2019-01-09 13:52:09.624 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:09.624 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.624 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:09.624 (4) [P1T1]: time=0.002 2019-01-09 13:52:09.624 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:09.624 (5) [P1T1]: loading 8 columns from result set: 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.id->13 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.name->global-clients 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.description->Global Database for client-specific tables 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.type->mysql 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.database->clients 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.host->clients-01-write 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.port->3306 2019-01-09 13:52:09.624 (5) [P1T1]: loaded column: databaseInstances.access->rw 2019-01-09 13:52:09.624 (4) [P1T1]: Record Loaded 2019-01-09 13:52:09.624 (4) [P1T1]: DataCache: put key global-clients 2019-01-09 13:52:09.625 (4) [P1T1]: closed [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.625 (4) [P1T1]: registered MBean [com.carldunham.jst.pooling:type=ObjectPool7a4bc307-e32f-45d8-a5cf-4ccd052d51fe] for 
class com.carldunham.jst.pooling.ObjectPool 2019-01-09 13:52:09.625 (3) [P1T1]: DatabasePool with a limit of 18 created 2019-01-09 13:52:09.625 (4) [P1T1]: registered MBean [com.carldunham.jst.db:type=DatabasePool67b841c7-62f5-4684-b102-bab07df32706] for class com.carldunham.jst.db.DatabasePool 2019-01-09 13:52:09.633 (3) [P1T1]: JDBC Driver Version: 5.1 (com.mysql.jdbc.Driver) 2019-01-09 13:52:09.633 (4) [P1T1]: opened [jdbc:mysql://clients-01-write:3306/clients?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:09.633 (4) [P1T1]: Created new Object 2019-01-09 13:52:09.633 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.633 (4) [P1T1]: spike: SQL->[select @@version] 2019-01-09 13:52:09.634 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:09.634 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.634 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:09.634 (4) [P1T1]: time=0.001 2019-01-09 13:52:09.634 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:09.634 (4) [P1T1]: test query succeeded, returned "5.5.34" 2019-01-09 13:52:09.635 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.635 (4) [P1T1]: spike: SQL->[SELECT GET_LOCK('com.thesearchagency.admax.common.AdMaxCampaignMigrator.2502', 600)] 2019-01-09 13:52:09.636 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:09.637 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.637 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:09.637 (4) [P1T1]: time=0.001 2019-01-09 13:52:09.637 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:09.637 (3) [P1T1]: Got migrator lock for Account ID #2502 2019-01-09 13:52:09.638 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:09.642 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.643 (4) [P1T1]: spike: SQL->[select * from `settings` where ((((`settings`.`scope`="accounts")) and ((`settings`.`scopeId`=2502))))] 2019-01-09 13:52:09.646 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:09.646 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.647 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:09.647 (4) [P1T1]: time=0.003 2019-01-09 13:52:09.647 (4) [P1T1]: Query executed in 0.003s 2019-01-09 13:52:09.647 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:09.647 (4) [P1T1]: spike: SQL->[select * from `settingDefaults` where (`settingDefaults`.`scope`="accounts")] 2019-01-09 13:52:09.649 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:09.649 (5) [P1T1]: addSub("db") 2019-01-09 13:52:09.649 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:09.649 (4) [P1T1]: time=0.002 2019-01-09 13:52:09.649 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:09.649 (4) [P1T1]: DataCache: put key accounts 2502 2019-01-09 13:52:10.053 (4) [P1T1]: DataCache: Found key accounts 2502 2019-01-09 13:52:10.053 (4) [P1T1]: Settings.getSettings: loaded from cache (key=accounts 2502) 2019-01-09 13:52:10.053 (3) [P1T1]: Applying algorithm [SABB] for Account ID #2502 in distributions [3, 178, 147] 2019-01-09 13:52:10.056 (3) [P1T1]: Processing data for 2019-01-07 00:00:00 EST 2019-01-09 13:52:10.060 (5) [P1T1]: quoteValue(), nothing special about it=>[3], type=class java.lang.Integer 2019-01-09 13:52:10.060 (5) [P1T1]: ProfilerRec.start("db",true) in Thread 
[P1T1] 2019-01-09 13:52:10.060 (4) [P1T1]: spike: SQL->[select * from `tsacommon`.`searchEngineAccounts` inner join `tsacommon`.`searchEngineUsers` on (((`tsacommon`.`searchEngineAccounts`.`searchEngineUserID`=`tsacommon`.`searchEngineUsers`.`id`))) inner join `tsacommon`.`accounts` on ((`tsacommon`.`accounts`.`id`=`tsacommon`.`searchEngineAccounts`.`accountID`)) where (((((`tsacommon`.`searchEngineAccounts`.`enabled`="true") and (((`tsacommon`.`searchEngineAccounts`.`searchEngineStatus`="ok") or (`tsacommon`.`searchEngineAccounts`.`searchEngineStatus`="disabled") or (`tsacommon`.`searchEngineAccounts`.`searchEngineStatus`="deleted"))) and (`tsacommon`.`searchEngineAccounts`.`accountID`="2502"))) and ((`tsacommon`.`searchEngineAccounts`.`distributionID` in (3))))) order by rand()] 2019-01-09 13:52:10.062 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.062 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.062 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.062 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.062 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.065 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.066 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.066 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14796") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.067 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.067 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.067 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.067 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.067 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.067 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.067 (5) [P1T1]: loaded column: dataSources.id->11863 2019-01-09 13:52:10.067 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.067 (5) [P1T1]: loaded column: dataSources.typeID->14796 2019-01-09 13:52:10.068 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.068 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.068 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.068 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.068 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15094") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.070 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.070 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.070 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.070 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.070 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.070 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.070 (5) [P1T1]: loaded column: dataSources.id->11947 2019-01-09 13:52:10.070 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.070 (5) [P1T1]: loaded column: dataSources.typeID->15094 2019-01-09 13:52:10.070 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.070 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.071 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.071 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.071 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14797") and 
(`type`="admax")) limit 1] 2019-01-09 13:52:10.072 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.072 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.072 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.072 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.072 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.072 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.072 (5) [P1T1]: loaded column: dataSources.id->11867 2019-01-09 13:52:10.072 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.072 (5) [P1T1]: loaded column: dataSources.typeID->14797 2019-01-09 13:52:10.072 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.072 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.073 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.073 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.073 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14799") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.074 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.074 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.074 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.074 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.074 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.074 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.074 (5) [P1T1]: loaded column: dataSources.id->11869 2019-01-09 13:52:10.074 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.074 (5) [P1T1]: loaded column: dataSources.typeID->14799 2019-01-09 13:52:10.074 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.074 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.074 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.075 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.075 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15095") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.080 (5) [P1T1]: recordAsSubs("db", 5, "P1T1") 2019-01-09 13:52:10.080 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.080 (5) [P1T1]: ProfilerRec.end("db", true): ret=5 2019-01-09 13:52:10.080 (4) [P1T1]: time=0.005 2019-01-09 13:52:10.080 (4) [P1T1]: Query executed in 0.005s 2019-01-09 13:52:10.080 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.080 (5) [P1T1]: loaded column: dataSources.id->11942 2019-01-09 13:52:10.080 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.080 (5) [P1T1]: loaded column: dataSources.typeID->15095 2019-01-09 13:52:10.080 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.080 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.080 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.080 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.081 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15098") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.082 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.082 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.082 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.082 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.082 (4) [P1T1]: Query executed in 0.001s 2019-01-09 
13:52:10.082 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.082 (5) [P1T1]: loaded column: dataSources.id->11946 2019-01-09 13:52:10.082 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.082 (5) [P1T1]: loaded column: dataSources.typeID->15098 2019-01-09 13:52:10.082 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.082 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.082 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.082 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.082 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14798") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.084 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.084 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.084 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.084 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.084 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.084 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.084 (5) [P1T1]: loaded column: dataSources.id->11870 2019-01-09 13:52:10.084 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.084 (5) [P1T1]: loaded column: dataSources.typeID->14798 2019-01-09 13:52:10.084 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.084 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.084 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.084 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.084 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15097") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.085 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.086 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.086 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.086 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.086 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.086 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.086 (5) [P1T1]: loaded column: dataSources.id->11945 2019-01-09 13:52:10.086 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.086 (5) [P1T1]: loaded column: dataSources.typeID->15097 2019-01-09 13:52:10.086 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.086 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.086 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.086 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.086 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15093") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.088 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.088 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.088 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.088 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.089 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.089 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.089 (5) [P1T1]: loaded column: dataSources.id->11948 2019-01-09 13:52:10.089 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.089 (5) [P1T1]: loaded column: dataSources.typeID->15093 2019-01-09 13:52:10.089 (5) [P1T1]: loaded column: 
dataSources.accountID->2502 2019-01-09 13:52:10.089 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.089 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.089 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.089 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15096") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.090 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.090 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.090 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.090 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.090 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.090 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.090 (5) [P1T1]: loaded column: dataSources.id->11943 2019-01-09 13:52:10.091 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.091 (5) [P1T1]: loaded column: dataSources.typeID->15096 2019-01-09 13:52:10.091 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.091 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.091 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.091 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.091 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14802") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.092 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.092 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.092 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.092 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.092 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.092 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.092 (5) [P1T1]: loaded column: dataSources.id->11864 2019-01-09 13:52:10.092 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.092 (5) [P1T1]: loaded column: dataSources.typeID->14802 2019-01-09 13:52:10.092 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.092 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.093 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.093 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.093 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14800") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.094 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.094 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.094 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.094 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.094 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.094 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.094 (5) [P1T1]: loaded column: dataSources.id->11868 2019-01-09 13:52:10.094 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.094 (5) [P1T1]: loaded column: dataSources.typeID->14800 2019-01-09 13:52:10.094 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.094 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.094 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.094 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.094 (4) [P1T1]: spike: SQL->[select * from 
`dataSources` where ((`accountID`=2502) and (`typeID`="15099") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.096 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.096 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.096 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.096 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.096 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.096 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.096 (5) [P1T1]: loaded column: dataSources.id->11944 2019-01-09 13:52:10.096 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.096 (5) [P1T1]: loaded column: dataSources.typeID->15099 2019-01-09 13:52:10.096 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.096 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.096 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.096 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.096 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14795") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.097 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.098 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.098 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.098 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.098 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.098 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.098 (5) [P1T1]: loaded column: dataSources.id->11866 2019-01-09 13:52:10.098 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.098 (5) [P1T1]: loaded column: dataSources.typeID->14795 2019-01-09 13:52:10.098 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.098 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.098 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.098 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.098 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="14801") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.099 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.099 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.099 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.099 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.099 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.099 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.099 (5) [P1T1]: loaded column: dataSources.id->11865 2019-01-09 13:52:10.099 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.099 (5) [P1T1]: loaded column: dataSources.typeID->14801 2019-01-09 13:52:10.100 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.100 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.100 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.100 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.100 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`accountID`=2502) and (`typeID`="15100") and (`type`="admax")) limit 1] 2019-01-09 13:52:10.107 (5) [P1T1]: recordAsSubs("db", 7, "P1T1") 2019-01-09 13:52:10.107 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.107 (5) [P1T1]: ProfilerRec.end("db", true): ret=7 2019-01-09 13:52:10.107 (4) [P1T1]: time=0.007 2019-01-09 
13:52:10.107 (4) [P1T1]: Query executed in 0.007s 2019-01-09 13:52:10.107 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.107 (5) [P1T1]: loaded column: dataSources.id->11941 2019-01-09 13:52:10.107 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.107 (5) [P1T1]: loaded column: dataSources.typeID->15100 2019-01-09 13:52:10.107 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.108 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.108 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.108 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.108 (4) [P1T1]: spike: SQL->[select * from `dataSources` where ((`dataSources`.`type`="admax") and (`dataSources`.`accountID`=2502))] 2019-01-09 13:52:10.111 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.111 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.111 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.111 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.111 (4) [P1T1]: Query executed in 0.003s 2019-01-09 13:52:10.111 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.111 (5) [P1T1]: loaded column: dataSources.id->11863 2019-01-09 13:52:10.111 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.111 (5) [P1T1]: loaded column: dataSources.typeID->14796 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11864 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14802 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11865 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14801 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11866 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14795 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11867 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14797 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11868 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14800 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11869 2019-01-09 13:52:10.112 (5) [P1T1]: loaded 
column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14799 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.112 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.id->11870 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.112 (5) [P1T1]: loaded column: dataSources.typeID->14798 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11941 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15100 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11942 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15095 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11943 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15096 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11944 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15099 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11945 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15097 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11946 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15098 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.id->11947 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.typeID->15094 2019-01-09 13:52:10.113 (5) [P1T1]: loaded column: dataSources.accountID->2502 2019-01-09 13:52:10.113 (5) [P1T1]: loading 4 columns from result set: 2019-01-09 13:52:10.114 (5) [P1T1]: loaded column: dataSources.id->11948 2019-01-09 13:52:10.114 (5) [P1T1]: loaded column: dataSources.type->admax 2019-01-09 13:52:10.114 (5) [P1T1]: loaded column: dataSources.typeID->15093 2019-01-09 13:52:10.114 (5) [P1T1]: loaded column: 
dataSources.accountID->2502 2019-01-09 13:52:10.115 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.115 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11863")) limit 1] 2019-01-09 13:52:10.118 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.118 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.118 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.118 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.118 (4) [P1T1]: Query executed in 0.003s 2019-01-09 13:52:10.118 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.118 (5) [P1T1]: loaded column: dataAvailability.id->73604 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11863 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.119 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.119 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.119 (5) [P1T1]: quoteValue(), nothing special about it=>[11863], type=class java.lang.Integer 2019-01-09 13:52:10.119 (5) [P1T1]: quoteValue(), nothing special about it=>[73604], type=class java.lang.Integer 2019-01-09 13:52:10.120 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.120 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11863,null,73604,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.121 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.121 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.121 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.121 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.121 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.123 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.123 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11863","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.124 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.124 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.124 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.124 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.124 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.125 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.125 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11864")) limit 1] 2019-01-09 13:52:10.126 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.126 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.126 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.126 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.126 (4) [P1T1]: Query executed in 0.001s 
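
Editor's note: for each admax dataSource the summarizer selects the day's dataAvailability row, re-writes it with status "in progress" via REPLACE (MySQL reports "2 row(s) affected" because REPLACE deletes the old row and inserts a new one), and appends an audit row to dataAvailabilityHistory. The sketch below mirrors those two statements, slightly simplified (the oldUpdated column is omitted); the wrapper method is an assumption.

import java.sql.*;

public class DataAvailabilitySketch {
    // Hypothetical wrapper around the two statements seen in the log:
    // REPLACE the day's dataAvailability row as "in progress", then log history.
    static void markInProgress(Connection conn, long id, int dataSourceId, Date date, Timestamp now) throws SQLException {
        String replaceSql = "replace into dataAvailability "
                + "(date, dataSourceID, statusText, id, override, updated, status, needsRerun) "
                + "values (?, ?, null, ?, 'false', ?, 'in progress', 'false')";
        try (PreparedStatement ps = conn.prepareStatement(replaceSql)) {
            ps.setDate(1, date);
            ps.setInt(2, dataSourceId);
            ps.setLong(3, id);
            ps.setTimestamp(4, now);
            ps.executeUpdate();   // REPLACE = delete + insert, hence "2 row(s) affected"
        }
        String historySql = "insert into dataAvailabilityHistory (date, dataSourceID, override, updated, status) "
                + "values (?, ?, 'false', ?, 'in progress')";
        try (PreparedStatement ps = conn.prepareStatement(historySql)) {
            ps.setDate(1, date);
            ps.setInt(2, dataSourceId);
            ps.setTimestamp(3, now);
            ps.executeUpdate();
        }
    }
}
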
2019-01-09 13:52:10.126 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.id->73605 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11864 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.127 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.127 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.127 (5) [P1T1]: quoteValue(), nothing special about it=>[11864], type=class java.lang.Integer 2019-01-09 13:52:10.127 (5) [P1T1]: quoteValue(), nothing special about it=>[73605], type=class java.lang.Integer 2019-01-09 13:52:10.127 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.127 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11864,null,73605,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.129 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.129 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.129 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.129 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.129 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.130 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.130 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11864","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.131 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.131 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.131 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.131 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.131 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.131 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.131 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11865")) limit 1] 2019-01-09 13:52:10.132 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.132 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.132 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.132 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.132 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.132 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.132 (5) [P1T1]: loaded column: dataAvailability.id->73606 2019-01-09 13:52:10.132 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11865 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: 
dataAvailability.status->success 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.133 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.133 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.133 (5) [P1T1]: quoteValue(), nothing special about it=>[11865], type=class java.lang.Integer 2019-01-09 13:52:10.133 (5) [P1T1]: quoteValue(), nothing special about it=>[73606], type=class java.lang.Integer 2019-01-09 13:52:10.133 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.133 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11865,null,73606,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.134 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.134 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.134 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.134 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.134 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.135 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.135 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11865","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.136 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.136 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.136 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.136 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.136 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.136 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.136 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11866")) limit 1] 2019-01-09 13:52:10.138 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.138 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.138 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.138 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.138 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.138 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.138 (5) [P1T1]: loaded column: dataAvailability.id->73607 2019-01-09 13:52:10.138 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.138 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.139 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.139 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11866 2019-01-09 13:52:10.139 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.139 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.139 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.139 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.139 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.139 (5) [P1T1]: quoteValue(), nothing special about it=>[11866], type=class java.lang.Integer 2019-01-09 13:52:10.139 (5) [P1T1]: quoteValue(), nothing special about it=>[73607], type=class java.lang.Integer 
2019-01-09 13:52:10.139 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.139 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11866,null,73607,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.140 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.140 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.140 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.140 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.140 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.140 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.140 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11866","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.142 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.142 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.142 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.142 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.142 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.142 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.142 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11867")) limit 1] 2019-01-09 13:52:10.143 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.143 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.143 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.143 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.143 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.143 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.id->73608 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11867 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.143 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.144 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.144 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.144 (5) [P1T1]: quoteValue(), nothing special about it=>[11867], type=class java.lang.Integer 2019-01-09 13:52:10.144 (5) [P1T1]: quoteValue(), nothing special about it=>[73608], type=class java.lang.Integer 2019-01-09 13:52:10.144 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.144 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11867,null,73608,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.145 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.145 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.145 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 
2019-01-09 13:52:10.145 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.145 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.145 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.145 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11867","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.147 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.148 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.148 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.148 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.148 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.148 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.148 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11868")) limit 1] 2019-01-09 13:52:10.149 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.150 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.150 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.150 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.150 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.150 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.id->73609 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11868 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.150 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.150 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.150 (5) [P1T1]: quoteValue(), nothing special about it=>[11868], type=class java.lang.Integer 2019-01-09 13:52:10.150 (5) [P1T1]: quoteValue(), nothing special about it=>[73609], type=class java.lang.Integer 2019-01-09 13:52:10.150 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.150 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11868,null,73609,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.154 (5) [P1T1]: recordAsSubs("db", 4, "P1T1") 2019-01-09 13:52:10.154 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.154 (5) [P1T1]: ProfilerRec.end("db", true): ret=4 2019-01-09 13:52:10.154 (4) [P1T1]: time=0.004 2019-01-09 13:52:10.154 (4) [P1T1]: Query executed in 0.004s, 2 row(s) affected 2019-01-09 13:52:10.155 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.155 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11868","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.157 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.157 (5) [P1T1]: addSub("db") 
2019-01-09 13:52:10.157 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.157 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.157 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.157 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.157 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11869")) limit 1] 2019-01-09 13:52:10.159 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.159 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.159 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.159 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.159 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.159 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.id->73610 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11869 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.159 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.159 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.159 (5) [P1T1]: quoteValue(), nothing special about it=>[11869], type=class java.lang.Integer 2019-01-09 13:52:10.159 (5) [P1T1]: quoteValue(), nothing special about it=>[73610], type=class java.lang.Integer 2019-01-09 13:52:10.159 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.159 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11869,null,73610,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.161 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.161 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.161 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.161 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.161 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.161 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.161 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11869","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.162 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.162 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.162 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.162 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.162 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.162 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.162 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11870")) limit 1] 2019-01-09 13:52:10.165 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.165 (5) [P1T1]: addSub("db") 2019-01-09 
13:52:10.165 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.165 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.165 (4) [P1T1]: Query executed in 0.003s 2019-01-09 13:52:10.165 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.id->73611 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11870 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.165 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.165 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.165 (5) [P1T1]: quoteValue(), nothing special about it=>[11870], type=class java.lang.Integer 2019-01-09 13:52:10.165 (5) [P1T1]: quoteValue(), nothing special about it=>[73611], type=class java.lang.Integer 2019-01-09 13:52:10.165 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.165 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11870,null,73611,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.167 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.167 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.167 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.167 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.167 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.168 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.168 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11870","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.169 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.169 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.169 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.169 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.169 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.169 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.169 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11941")) limit 1] 2019-01-09 13:52:10.170 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.171 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.171 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.171 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.171 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.171 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.id->73850 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: 
dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11941 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.171 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.171 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.171 (5) [P1T1]: quoteValue(), nothing special about it=>[11941], type=class java.lang.Integer 2019-01-09 13:52:10.171 (5) [P1T1]: quoteValue(), nothing special about it=>[73850], type=class java.lang.Integer 2019-01-09 13:52:10.171 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.171 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11941,null,73850,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.174 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.174 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.174 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.174 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.174 (4) [P1T1]: Query executed in 0.003s, 2 row(s) affected 2019-01-09 13:52:10.175 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.175 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11941","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.177 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.177 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.177 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.177 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.177 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.177 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.177 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11942")) limit 1] 2019-01-09 13:52:10.180 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.180 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.180 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.180 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.180 (4) [P1T1]: Query executed in 0.003s 2019-01-09 13:52:10.180 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.180 (5) [P1T1]: loaded column: dataAvailability.id->73851 2019-01-09 13:52:10.180 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.180 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.180 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.180 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11942 2019-01-09 13:52:10.181 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.181 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.181 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.181 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.181 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.181 (5) [P1T1]: 
quoteValue(), nothing special about it=>[11942], type=class java.lang.Integer 2019-01-09 13:52:10.181 (5) [P1T1]: quoteValue(), nothing special about it=>[73851], type=class java.lang.Integer 2019-01-09 13:52:10.181 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.181 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11942,null,73851,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.182 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.182 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.182 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.182 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.182 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.182 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.182 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11942","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.184 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.184 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.184 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.184 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.184 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.184 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.184 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11943")) limit 1] 2019-01-09 13:52:10.186 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.186 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.186 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.186 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.186 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.186 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.id->73852 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11943 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.186 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.186 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.186 (5) [P1T1]: quoteValue(), nothing special about it=>[11943], type=class java.lang.Integer 2019-01-09 13:52:10.186 (5) [P1T1]: quoteValue(), nothing special about it=>[73852], type=class java.lang.Integer 2019-01-09 13:52:10.186 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.186 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11943,null,73852,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], 
genflags=yes 2019-01-09 13:52:10.187 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.187 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.187 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.187 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.187 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.188 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.188 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11943","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.189 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.189 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.189 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.189 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.189 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.189 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.189 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11944")) limit 1] 2019-01-09 13:52:10.193 (5) [P1T1]: recordAsSubs("db", 4, "P1T1") 2019-01-09 13:52:10.193 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.193 (5) [P1T1]: ProfilerRec.end("db", true): ret=4 2019-01-09 13:52:10.193 (4) [P1T1]: time=0.004 2019-01-09 13:52:10.193 (4) [P1T1]: Query executed in 0.004s 2019-01-09 13:52:10.193 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.id->73853 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11944 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.193 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.194 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.194 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.194 (5) [P1T1]: quoteValue(), nothing special about it=>[11944], type=class java.lang.Integer 2019-01-09 13:52:10.194 (5) [P1T1]: quoteValue(), nothing special about it=>[73853], type=class java.lang.Integer 2019-01-09 13:52:10.194 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.194 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11944,null,73853,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.198 (5) [P1T1]: recordAsSubs("db", 4, "P1T1") 2019-01-09 13:52:10.198 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.198 (5) [P1T1]: ProfilerRec.end("db", true): ret=4 2019-01-09 13:52:10.198 (4) [P1T1]: time=0.004 2019-01-09 13:52:10.198 (4) [P1T1]: Query executed in 0.004s, 2 row(s) affected 2019-01-09 13:52:10.198 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.198 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values 
("2019-01-07 00:00:00","11944","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.199 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.199 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.199 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.199 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.199 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.200 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.200 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11945")) limit 1] 2019-01-09 13:52:10.202 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.202 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.202 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.202 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.202 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.202 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.202 (5) [P1T1]: loaded column: dataAvailability.id->73854 2019-01-09 13:52:10.202 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.202 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.202 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.202 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11945 2019-01-09 13:52:10.203 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.203 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.203 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.203 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.203 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.203 (5) [P1T1]: quoteValue(), nothing special about it=>[11945], type=class java.lang.Integer 2019-01-09 13:52:10.203 (5) [P1T1]: quoteValue(), nothing special about it=>[73854], type=class java.lang.Integer 2019-01-09 13:52:10.203 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.203 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11945,null,73854,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.204 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.204 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.204 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.204 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.204 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.204 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.204 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11945","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.209 (5) [P1T1]: recordAsSubs("db", 5, "P1T1") 2019-01-09 13:52:10.209 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.209 (5) [P1T1]: ProfilerRec.end("db", true): ret=5 2019-01-09 13:52:10.209 (4) [P1T1]: time=0.005 2019-01-09 13:52:10.209 (4) [P1T1]: Query executed in 0.005s, 1 row(s) affected 2019-01-09 13:52:10.209 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.210 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` 
where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11946")) limit 1] 2019-01-09 13:52:10.211 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.211 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.211 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.211 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.211 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.211 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.id->73855 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:43.0 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11946 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.211 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.211 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.211 (5) [P1T1]: quoteValue(), nothing special about it=>[11946], type=class java.lang.Integer 2019-01-09 13:52:10.211 (5) [P1T1]: quoteValue(), nothing special about it=>[73855], type=class java.lang.Integer 2019-01-09 13:52:10.211 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.211 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11946,null,73855,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.219 (5) [P1T1]: recordAsSubs("db", 8, "P1T1") 2019-01-09 13:52:10.219 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.219 (5) [P1T1]: ProfilerRec.end("db", true): ret=8 2019-01-09 13:52:10.219 (4) [P1T1]: time=0.008 2019-01-09 13:52:10.219 (4) [P1T1]: Query executed in 0.008s, 2 row(s) affected 2019-01-09 13:52:10.219 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.220 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11946","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.222 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.222 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.222 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.222 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.222 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.223 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.223 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11947")) limit 1] 2019-01-09 13:52:10.224 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.224 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.224 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.224 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.224 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.224 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.id->73856 2019-01-09 13:52:10.224 (5) [P1T1]: loaded 
column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:44.0 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11947 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.224 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.224 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.224 (5) [P1T1]: quoteValue(), nothing special about it=>[11947], type=class java.lang.Integer 2019-01-09 13:52:10.224 (5) [P1T1]: quoteValue(), nothing special about it=>[73856], type=class java.lang.Integer 2019-01-09 13:52:10.225 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.225 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11947,null,73856,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.227 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.227 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.227 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.227 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.227 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.227 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.227 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11947","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.228 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.228 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.228 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.228 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.228 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.228 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.228 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11948")) limit 1] 2019-01-09 13:52:10.229 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.229 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.229 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.230 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.230 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.230 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.id->73857 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 12:45:44.0 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11948 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.status->success 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.230 (5) [P1T1]: loaded column: 
dataAvailability.needsRerun->false 2019-01-09 13:52:10.292 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.292 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.292 (5) [P1T1]: quoteValue(), nothing special about it=>[11948], type=class java.lang.Integer 2019-01-09 13:52:10.292 (5) [P1T1]: quoteValue(), nothing special about it=>[73857], type=class java.lang.Integer 2019-01-09 13:52:10.292 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.292 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11948,null,73857,"false","2019-01-09 13:52:10","00:00:00","in progress","false")], genflags=yes 2019-01-09 13:52:10.294 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.294 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.294 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.294 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.294 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.294 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.294 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11948","false","2019-01-09 13:52:10","in progress")], genflags=yes 2019-01-09 13:52:10.296 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.296 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.296 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.296 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.296 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.301 (4) [P1T1]: preparing SQL statement [select admaxNominalMonthlyBudgets.ID,admaxCampaignID,startDate,endDate,budget from `st-tracker`.admaxCampaigns join `st-tracker`.admaxNominalMonthlyBudgets on admaxCampaigns.ID=admaxNominalMonthlyBudgets.admaxCampaignID where accountID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.305 (4) [P1T1]: preparing SQL statement [select ID,accountID,admaxCampaignID,searchEngineAccountID,migrationDate,lastAdMaxRunDate from `st-tracker`.admaxCampaignMigrations where accountID=? and migrationDate>=? and lastAdMaxRunDate<=?] 2019-01-09 13:52:10.306 (4) [P1T1]: preparing SQL statement [select admaxNominalMonthlyBudgets.ID,admaxCampaignID,startDate,endDate,budget from `st-tracker`.admaxCampaigns join `st-tracker`.admaxNominalMonthlyBudgets on admaxCampaigns.ID=admaxNominalMonthlyBudgets.admaxCampaignID where accountID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.311 (4) [P1T1]: preparing SQL statement [select ID,accountID,startDate,endDate,anniversaryDate from `st-tracker`.admaxAccountAnniversary where accountID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.313 (4) [P1T1]: preparing SQL statement [select ID,accountID,startDate,endDate,anniversaryDate from `st-tracker`.admaxAccountAnniversary where accountID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.314 (4) [P1T1]: preparing SQL statement [select ID,accountID,startDate,endDate,anniversaryDate from `st-tracker`.admaxAccountAnniversary where accountID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.315 (4) [P1T1]: preparing SQL statement [select ID,accountID,startDate,endDate,anniversaryDate from `st-tracker`.admaxAccountAnniversary where accountID=? 
and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.320 (4) [P1T1]: preparing SQL statement [select id, admaxCampaignID, startDate, endDate, status from `st-tracker`.admaxCampaignBudgetAccrualStatus where admaxCampaignID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.321 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.322 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7779 2019-01-09 13:52:10.325 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.326 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7780 2019-01-09 13:52:10.326 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.327 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7781 2019-01-09 13:52:10.328 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.328 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7782 2019-01-09 13:52:10.329 (4) [P1T1]: preparing SQL statement [select id, admaxCampaignID, startDate, endDate, status from `st-tracker`.admaxCampaignBudgetAccrualStatus where admaxCampaignID=? and startDate<=? and (endDate is null or endDate>=?)] 2019-01-09 13:52:10.329 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.330 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7779 2019-01-09 13:52:10.333 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.334 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7780 2019-01-09 13:52:10.335 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.336 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7781 2019-01-09 13:52:10.336 (4) [P1T1]: preparing SQL statement [SELECT status FROM `st-tracker`.admaxCampaigns WHERE id = ?] 2019-01-09 13:52:10.337 (3) [P1T1]: Retrieved campaign status Deleted for AdMax campaign 7782 2019-01-09 13:52:10.338 (4) [P1T1]: preparing SQL statement [select aseam.admaxCampaignID, aseam.searchEngineAccountID, sea.distributionID, ac.status from `st-tracker`.admaxCampaigns ac join `st-tracker`.admaxSearchEngineAccountMap aseam on aseam.admaxCampaignID = ac.id join tsacommon.searchEngineAccounts sea on sea.ID = aseam.searchEngineAccountID where ac.accountID = ?] 2019-01-09 13:52:10.339 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.340 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.340 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.341 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? 
and newValue = 'deleted'] 2019-01-09 13:52:10.342 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.342 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.343 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.343 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.344 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.345 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.349 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.353 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.354 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.358 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.359 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.359 (4) [P1T1]: preparing SQL statement [select changeTime from logging.searchEngineAccountChanges where `column` = 'searchEngineStatus' AND searchEngineAccountID = ? and newValue = 'deleted'] 2019-01-09 13:52:10.362 (4) [P1T1]: preparing SQL statement [select tsacommon.searchEngineAccounts.ID,staging.googleMccAdMax.date,staging.googleMccAdMax.clicks,staging.googleMccAdMax.cost,staging.googleMccAdMax.averagePosition from tsacommon.searchEngineAccounts left join staging.googleMccAdMax on staging.googleMccAdMax.campaignid=cast(tsacommon.searchEngineAccounts.searchEngineIdentifier as unsigned) where tsacommon.searchEngineAccounts.accountID=? and staging.googleMccAdMax.date=?] 
2019-01-09 13:52:10.363 (4) [P1T1]: preparing SQL statement [select tsacommon.searchEngineAccounts.ID,staging.yahooJAdMax.date,staging.yahooJAdMax.clicks,staging.yahooJAdMax.cost,staging.yahooJAdMax.averagePosition from tsacommon.searchEngineAccounts left join staging.yahooJAdMax on tsacommon.searchEngineAccounts.accountID=staging.yahooJAdMax.accountID and staging.yahooJAdMax.campaignid=cast(tsacommon.searchEngineAccounts.searchEngineIdentifier as unsigned) where tsacommon.searchEngineAccounts.accountID=? and staging.yahooJAdMax.date=?] 2019-01-09 13:52:10.365 (4) [P1T1]: preparing SQL statement [select tsacommon.searchEngineAccounts.ID,staging.bingAdMax.date,staging.bingAdMax.clicks,staging.bingAdMax.cost,staging.bingAdMax.averagePosition from tsacommon.searchEngineAccounts left join staging.bingAdMax on staging.bingAdMax.campaignid=cast(tsacommon.searchEngineAccounts.searchEngineIdentifier as unsigned) where tsacommon.searchEngineAccounts.accountID=? and staging.bingAdMax.date=?] 2019-01-09 13:52:10.369 (4) [P1T1]: preparing SQL statement [select ID,admaxCampaignID,type,date,pooledCarryover,dedicatedCarryover,commonCarryover from `st-tracker`.admaxCarryovers where admaxCampaignID=? and type=? and date=?] 2019-01-09 13:52:10.375 (4) [P1T1]: preparing SQL statement [select ID,admaxCampaignID,type,date,budget from `st-tracker`.admaxActualDailyBudgets where admaxCampaignID=? and type=? and date=?] 2019-01-09 13:52:10.382 (4) [P1T1]: preparing SQL statement [select admaxCampaignID,date,budget from `st-tracker`.admaxNominalDailyBudgets where admaxCampaignID=? and date=?] 2019-01-09 13:52:10.387 (4) [P1T1]: preparing SQL statement [select ID,searchEngineAccountID,date,balanceFactor from `st-tracker`.admaxSearchEngineAccountBalanceFactors where searchEngineAccountID=? and date=?] 2019-01-09 13:52:10.400 (4) [P1T1]: preparing SQL statement [select searchEngineAccountID,date,budget,actualSEBudget,budgetStatus from `st-tracker`.admaxSearchEngineAccountActualDailyBudgets where searchEngineAccountID=? and date=?] 
2019-01-09 13:52:10.438 (3) [P1T1]: Distribution Properties Cache: Loading new properties for eb11f94b-6db1-4004-997a-53a901b02366_ALGORITHM 2019-01-09 13:52:10,448 [P1T1] INFO o.a.c.h.auth.AuthChallengeProcessor - digest authentication scheme selected 2019-01-09 13:52:10.574 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.574 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.575 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.575 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.575 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.575 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.575 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.575 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.577 (4) [P1T1]: preparing SQL statement [select distinct admaxNominalMonthlyBudgets.ID,admaxNominalMonthlyBudgets.admaxCampaignID,startDate,endDate,budget from `st-tracker`.admaxCampaigns join `st-tracker`.admaxNominalMonthlyBudgets on admaxCampaigns.ID=admaxNominalMonthlyBudgets.admaxCampaignID join `st-tracker`.admaxSearchEngineAccountMap on admaxCampaigns.ID=admaxSearchEngineAccountMap.admaxCampaignID join tsacommon.searchEngineAccounts on admaxSearchEngineAccountMap.searchEngineAccountID=searchEngineAccounts.ID where admaxCampaigns.accountID=? and startDate<=? and (endDate is null or endDate>=?) and distributionID=?] 2019-01-09 13:52:10.583 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,sloshFactor,budgetType,resellerID from `st-tracker`.admaxSloshFactors where elementType=? and startDate<=? and (endDate is null or endDate>=?) 
and sloshFactor< 1 and elementID in ()]
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')' at line 1
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.Util.getInstance(Util.java:386)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1054)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4120)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4052)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2503)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2664)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2815)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2155)
    at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1379)
    at com.thesearchagency.admax.dao.AdMaxSloshFactorsDAO.findCampaignsSloshFactorByDate(AdMaxSloshFactorsDAO.java:135)
    at com.thesearchagency.admax.algorithms.sabb.AdMaxSABBBudgetSlosher.findDefaultCampaignsByDate(AdMaxSABBBudgetSlosher.java:322)
    at com.thesearchagency.admax.algorithms.sabb.AdMaxSABBAlgorithm.doCarryoverAdjustmentOnBudgetTypeChange(AdMaxSABBAlgorithm.java:510)
    at com.thesearchagency.admax.algorithms.sabb.AdMaxSABBAlgorithm.applyAlgorithm(AdMaxSABBAlgorithm.java:289)
    at com.thesearchagency.admax.AdMaxSummarizer$AccountWorker.call(AdMaxSummarizer.java:639)
    at com.thesearchagency.admax.AdMaxSummarizer$AccountWorker.call(AdMaxSummarizer.java:504)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
2019-01-09 13:52:10.586 (2) [P1T1]: Warning, could not get default campaigns for a distribution: 3 to an account 2502.
2019-01-09 13:52:10.586 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,sloshFactor,budgetType,resellerID from `st-tracker`.admaxSloshFactors where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.594 (4) [P1T1]: preparing SQL statement [select admaxCampaignID,referenceID,createTime,refund from `st-tracker`.admaxCampaignRefunds where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.595 (4) [P1T1]: preparing SQL statement [select ID,admaxCampaignID,createTime,type,carryoverAdjustment from `st-tracker`.admaxCarryoverAdjustments where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.597 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,sloshFactor,budgetType,resellerID from `st-tracker`.admaxSloshFactors where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.598 (4) [P1T1]: preparing SQL statement [select admaxCampaignID,referenceID,createTime,refund from `st-tracker`.admaxCampaignRefunds where admaxCampaignID=? and date(createTime)=?] 
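
The MySQLSyntaxErrorException above comes from AdMaxSloshFactorsDAO.findCampaignsSloshFactorByDate building a statement whose IN clause is empty ("elementID in ()"), which MySQL rejects. The sketch below illustrates one way such a call could be guarded; it is an assumption-laden example, not the real DAO, and only the table and column names are taken from the log.

    import java.sql.*;
    import java.util.*;

    /**
     * Hypothetical illustration of guarding against an empty IN list. MySQL cannot parse
     * "in ()", so the query text should only be built and executed when IDs are present.
     * This is not the actual AdMaxSloshFactorsDAO implementation.
     */
    public class SloshFactorLookup {

        public static List<Integer> findSloshFactorRows(Connection conn, String elementType,
                                                        Date asOf, List<Integer> elementIDs) throws SQLException {
            List<Integer> rowIDs = new ArrayList<>();
            if (elementIDs.isEmpty()) {
                // An empty ID list can only match nothing; returning early avoids
                // generating "... and elementID in ()", which is invalid SQL.
                return rowIDs;
            }

            // Build one placeholder per ID, e.g. "?,?,?" for three IDs.
            StringBuilder placeholders = new StringBuilder();
            for (int i = 0; i < elementIDs.size(); i++) {
                if (i > 0) {
                    placeholders.append(',');
                }
                placeholders.append('?');
            }

            String sql = "select ID from `st-tracker`.admaxSloshFactors"
                    + " where elementType=? and startDate<=? and (endDate is null or endDate>=?)"
                    + " and sloshFactor<1 and elementID in (" + placeholders + ")";

            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, elementType);
                ps.setDate(2, asOf);
                ps.setDate(3, asOf);
                int idx = 4;
                for (Integer id : elementIDs) {
                    ps.setInt(idx++, id);
                }
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        rowIDs.add(rs.getInt("ID"));
                    }
                }
            }
            return rowIDs;
        }
    }
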
2019-01-09 13:52:10.599 (4) [P1T1]: preparing SQL statement [select ID,admaxCampaignID,createTime,type,carryoverAdjustment from `st-tracker`.admaxCarryoverAdjustments where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.600 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,sloshFactor,budgetType,resellerID from `st-tracker`.admaxSloshFactors where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.601 (4) [P1T1]: preparing SQL statement [select admaxCampaignID,referenceID,createTime,refund from `st-tracker`.admaxCampaignRefunds where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.602 (4) [P1T1]: preparing SQL statement [select ID,admaxCampaignID,createTime,type,carryoverAdjustment from `st-tracker`.admaxCarryoverAdjustments where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.603 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,sloshFactor,budgetType,resellerID from `st-tracker`.admaxSloshFactors where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.605 (4) [P1T1]: preparing SQL statement [select admaxCampaignID,referenceID,createTime,refund from `st-tracker`.admaxCampaignRefunds where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.606 (4) [P1T1]: preparing SQL statement [select ID,admaxCampaignID,createTime,type,carryoverAdjustment from `st-tracker`.admaxCarryoverAdjustments where admaxCampaignID=? and date(createTime)=?] 2019-01-09 13:52:10.607 (4) [P1T1]: preparing SQL statement [select sum(data.commonCarryover) from (select distinct c.admaxCampaignID, c.commonCarryover from admaxCampaigns ac join admaxCarryovers c on c.admaxCampaignID = ac.ID and c.type=? and c.date = least(?, (select max(last_carryover.date) from admaxCarryovers last_carryover where last_carryover.admaxCampaignID = ac.ID and last_carryover.type = c.type)) join admaxNominalMonthlyBudgets anmb on anmb.admaxCampaignID = ac.ID and anmb.startDate = (select max(last_nom_budget.startDate) from admaxNominalMonthlyBudgets last_nom_budget where last_nom_budget.admaxCampaignID = ac.ID) join admaxSearchEngineAccountMap aseam on aseam.admaxCampaignID = ac.ID join tsacommon.searchEngineAccounts sea on sea.ID = aseam.searchEngineAccountID where ac.accountID=? and ac.status=? and ((anmb.endDate is null) or (? > anmb.endDate)) and sea.distributionID = ?) data] 2019-01-09 13:52:10.612 (4) [P1T1]: Sorted campaign order: [com.thesearchagency.admax.algorithms.common.AdMaxDataCollection@3393b18b, com.thesearchagency.admax.algorithms.common.AdMaxDataCollection@50b81290, com.thesearchagency.admax.algorithms.common.AdMaxDataCollection@34ec8c89, com.thesearchagency.admax.algorithms.common.AdMaxDataCollection@102862d2] 2019-01-09 13:52:10.613 (4) [P1T1]: There is no nominal monthly budget entry for 'today' for AdMaxCampaignID: 7779 2019-01-09 13:52:10.613 (4) [P1T1]: There is no nominal monthly budget entry for 'today' for AdMaxCampaignID: 7781 2019-01-09 13:52:10.613 (4) [P1T1]: There is no nominal monthly budget entry for 'today' for AdMaxCampaignID: 7782 2019-01-09 13:52:10.613 (4) [P1T1]: There is no nominal monthly budget entry for 'today' for AdMaxCampaignID: 7780 2019-01-09 13:52:10.615 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,budgetCapMultiplier,resellerID from `st-tracker`.admaxBudgetCapMultipliers where elementType=? 
and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.623 (3) [P1T1]: Budget capped at: 0.0 2019-01-09 13:52:10.624 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,budgetCapMultiplier,resellerID from `st-tracker`.admaxBudgetCapMultipliers where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.630 (3) [P1T1]: Budget capped at: 0.0 2019-01-09 13:52:10.630 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,budgetCapMultiplier,resellerID from `st-tracker`.admaxBudgetCapMultipliers where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.637 (3) [P1T1]: Budget capped at: 0.0 2019-01-09 13:52:10.637 (4) [P1T1]: preparing SQL statement [select ID,elementType,elementID,startDate,endDate,budgetCapMultiplier,resellerID from `st-tracker`.admaxBudgetCapMultipliers where elementType=? and elementID=? and startDate<=? and (endDate is null or endDate>=?) and resellerID=?] 2019-01-09 13:52:10.641 (3) [P1T1]: Budget capped at: 0.0 2019-01-09 13:52:10.642 (3) [P1T1]: Pausing campaign 14795 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.642 (3) [P1T1]: Pausing campaign 14796 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.642 (3) [P1T1]: Pausing campaign 14797 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.642 (3) [P1T1]: Pausing campaign 14798 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.642 (3) [P1T1]: Pausing campaign 14799 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.642 (3) [P1T1]: Pausing campaign 14800 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.643 (3) [P1T1]: Pausing campaign 14801 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.643 (3) [P1T1]: Pausing campaign 14802 because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01) 2019-01-09 13:52:10.644 (4) [P1T1]: preparing SQL statement [insert into `st-tracker`.admaxCarryovers (admaxCampaignID, type, date, pooledCarryover, dedicatedCarryover, commonCarryover) values (?,?,?,?,?,?) on duplicate key update pooledCarryover=?,dedicatedCarryover=?,commonCarryover=?] 
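For context, the "Budget capped at: 0.0" and "Pausing campaign ... because budget (0.0 ==> 0.0) is less than the minimum allowed budget (0.01)" entries above boil down to a simple floor check: when a campaign's capped daily budget ends up below the configured minimum, the campaign is paused rather than having the budget pushed to the engine. A minimal, hypothetical Java sketch of that check (class, method, and constant names are illustrative only, not the actual AdMax code):

import java.math.BigDecimal;

// Hypothetical illustration of the minimum-budget guard reported in the log:
// campaigns whose capped budget falls below the floor are paused.
public class BudgetGuard {

    // Matches the "minimum allowed budget (0.01)" shown in the log output.
    private static final BigDecimal MIN_ALLOWED_BUDGET = new BigDecimal("0.01");

    /** Returns true when the campaign should be paused instead of updated. */
    public static boolean shouldPause(BigDecimal cappedBudget) {
        return cappedBudget.compareTo(MIN_ALLOWED_BUDGET) < 0;
    }

    public static void main(String[] args) {
        BigDecimal oldBudget = new BigDecimal("0.0");
        BigDecimal newBudget = new BigDecimal("0.0");   // "Budget capped at: 0.0"
        long campaignId = 14795L;                       // example SE campaign ID from the log

        if (shouldPause(newBudget)) {
            System.out.printf(
                "Pausing campaign %d because budget (%s ==> %s) is less than the minimum allowed budget (%s)%n",
                campaignId, oldBudget, newBudget, MIN_ALLOWED_BUDGET);
        }
    }
}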
2019-01-09 13:52:10.650 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14795 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14796 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14797 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14798 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14799 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14800 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14801 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save balance factor for search engine account ID: 14802 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14795 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14796 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14797 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14798 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14799 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14800 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14801 2019-01-09 13:52:10.651 (4) [P1T1]: No budget for today so don't save actual budget for search engine account ID: 14802 2019-01-09 13:52:10.656 (5) [P1T1]: quoteValue(), nothing special about it=>[2502], type=class java.lang.Integer 2019-01-09 13:52:10.656 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.656 (4) [P1T1]: spike: SQL->[select * from `tsacommon`.`accounts` where (`id`=2502) limit 1] 2019-01-09 13:52:10.658 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.658 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.658 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.658 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.658 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.658 (5) [P1T1]: loading 21 columns from result set: 2019-01-09 13:52:10.658 (5) [P1T1]: loaded column: accounts.id->2502 2019-01-09 13:52:10.658 (5) [P1T1]: loaded column: accounts.description->eb11f94b-6db1-4004-997a-53a901b02366 2019-01-09 13:52:10.658 (5) [P1T1]: loaded column: accounts.tier->1 2019-01-09 13:52:10.658 (5) [P1T1]: loaded column: accounts.priority->50 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.reportClicksOut->false 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.reportConversions->false 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.conversionTypes->paid 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.isBidManaged->true 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.isActive->true 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.perfMetric->cpa 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.providesConversionData->true 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: 
accounts.addConversionData->true 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.fractionalConversionCounts->false 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.hideNullSources->false 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.nowTime->null 2019-01-09 13:52:10.659 (5) [P1T1]: skipping invalid column: accounts.warehouseDatabase 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.databaseInstance->1 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.warehouseInstance->1 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.isOnline->true 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.currencyCode->USD 2019-01-09 13:52:10.659 (5) [P1T1]: loaded column: accounts.timeZoneID->America/New_York 2019-01-09 13:52:10.659 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.659 (4) [P1T1]: Waiting up to 7200 seconds to lock eb11f94b-6db1-4004-997a-53a901b02366. 2019-01-09 13:52:10.659 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.659 (4) [P1T1]: spike: SQL->[SELECT GET_LOCK('com.thesearchagency.admaxlistener.dao.TSACommonDAO.eb11f94b-6db1-4004-997a-53a901b02366', 7200)] 2019-01-09 13:52:10.660 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.660 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.660 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.660 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.660 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.660 (4) [P1T1]: Locked account: eb11f94b-6db1-4004-997a-53a901b02366 2019-01-09 13:52:10.661 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.661 (4) [P1T1]: spike: SQL->[SELECT RELEASE_LOCK('com.thesearchagency.admaxlistener.dao.TSACommonDAO.eb11f94b-6db1-4004-997a-53a901b02366')] 2019-01-09 13:52:10.662 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.662 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.662 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.662 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.662 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14795 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14796 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14797 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14798 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14799 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14800 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14801 2019-01-09 13:52:10.663 (4) [P1T1]: Using value false for property sabb.biddingEnabled.google of type ALGORITHM 2019-01-09 
13:52:10.663 (4) [P1T1]: Bidding not enabled for SE campaign 14802 2019-01-09 13:52:10.664 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.664 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11863")) limit 1] 2019-01-09 13:52:10.665 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.665 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.665 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.665 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.665 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.665 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.id->73604 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11863 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.665 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.665 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.665 (5) [P1T1]: quoteValue(), nothing special about it=>[11863], type=class java.lang.Integer 2019-01-09 13:52:10.665 (5) [P1T1]: quoteValue(), nothing special about it=>[73604], type=class java.lang.Integer 2019-01-09 13:52:10.665 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.666 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11863,null,73604,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.673 (5) [P1T1]: recordAsSubs("db", 7, "P1T1") 2019-01-09 13:52:10.673 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.673 (5) [P1T1]: ProfilerRec.end("db", true): ret=7 2019-01-09 13:52:10.673 (4) [P1T1]: time=0.007 2019-01-09 13:52:10.673 (4) [P1T1]: Query executed in 0.007s, 2 row(s) affected 2019-01-09 13:52:10.674 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.674 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11863","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.675 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.675 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.675 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.675 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.675 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.675 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.675 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11864")) limit 1] 2019-01-09 13:52:10.677 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.677 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.677 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.677 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.677 (4) 
[P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.677 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.id->73605 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11864 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.677 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.677 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.677 (5) [P1T1]: quoteValue(), nothing special about it=>[11864], type=class java.lang.Integer 2019-01-09 13:52:10.677 (5) [P1T1]: quoteValue(), nothing special about it=>[73605], type=class java.lang.Integer 2019-01-09 13:52:10.677 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.677 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11864,null,73605,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.679 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.679 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.679 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.679 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.679 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.679 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.679 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11864","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.680 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.680 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.680 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.680 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.680 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.681 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.681 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11865")) limit 1] 2019-01-09 13:52:10.682 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.683 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.683 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.683 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.683 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.683 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.id->73606 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11865 2019-01-09 13:52:10.683 (5) 
[P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.683 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.683 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.683 (5) [P1T1]: quoteValue(), nothing special about it=>[11865], type=class java.lang.Integer 2019-01-09 13:52:10.683 (5) [P1T1]: quoteValue(), nothing special about it=>[73606], type=class java.lang.Integer 2019-01-09 13:52:10.683 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.683 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11865,null,73606,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.684 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.684 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.684 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.684 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.684 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.685 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.685 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11865","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.687 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.687 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.687 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.687 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.687 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.687 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.687 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11866")) limit 1] 2019-01-09 13:52:10.689 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.689 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.689 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.689 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.689 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.689 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.id->73607 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11866 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.689 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.690 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.690 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.690 (5) [P1T1]: quoteValue(), nothing special about it=>[11866], type=class java.lang.Integer 2019-01-09 13:52:10.690 (5) [P1T1]: quoteValue(), nothing special about it=>[73607], 
type=class java.lang.Integer 2019-01-09 13:52:10.690 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.690 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11866,null,73607,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.693 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.693 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.693 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.693 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.693 (4) [P1T1]: Query executed in 0.003s, 2 row(s) affected 2019-01-09 13:52:10.693 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.693 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11866","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.694 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.694 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.694 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.694 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.694 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.694 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.694 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11867")) limit 1] 2019-01-09 13:52:10.695 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.695 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.695 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.695 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.696 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.696 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.id->73608 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11867 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.696 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.696 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.696 (5) [P1T1]: quoteValue(), nothing special about it=>[11867], type=class java.lang.Integer 2019-01-09 13:52:10.696 (5) [P1T1]: quoteValue(), nothing special about it=>[73608], type=class java.lang.Integer 2019-01-09 13:52:10.696 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.696 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11867,null,73608,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.697 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.697 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.697 (5) [P1T1]: 
ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.697 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.697 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.697 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.697 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11867","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.698 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.698 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.699 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.699 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.699 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.699 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.699 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11868")) limit 1] 2019-01-09 13:52:10.700 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.700 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.700 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.700 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.700 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.700 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.id->73609 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11868 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.700 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.700 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.700 (5) [P1T1]: quoteValue(), nothing special about it=>[11868], type=class java.lang.Integer 2019-01-09 13:52:10.700 (5) [P1T1]: quoteValue(), nothing special about it=>[73609], type=class java.lang.Integer 2019-01-09 13:52:10.701 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.701 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11868,null,73609,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.709 (5) [P1T1]: recordAsSubs("db", 8, "P1T1") 2019-01-09 13:52:10.709 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.709 (5) [P1T1]: ProfilerRec.end("db", true): ret=8 2019-01-09 13:52:10.709 (4) [P1T1]: time=0.008 2019-01-09 13:52:10.709 (4) [P1T1]: Query executed in 0.008s, 2 row(s) affected 2019-01-09 13:52:10.709 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.709 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11868","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.722 (5) [P1T1]: recordAsSubs("db", 13, "P1T1") 2019-01-09 13:52:10.722 
(5) [P1T1]: addSub("db") 2019-01-09 13:52:10.722 (5) [P1T1]: ProfilerRec.end("db", true): ret=13 2019-01-09 13:52:10.722 (4) [P1T1]: time=0.013 2019-01-09 13:52:10.722 (4) [P1T1]: Query executed in 0.013s, 1 row(s) affected 2019-01-09 13:52:10.723 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.723 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11869")) limit 1] 2019-01-09 13:52:10.724 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.724 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.724 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.724 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.724 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.724 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.id->73610 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11869 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.724 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.724 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.724 (5) [P1T1]: quoteValue(), nothing special about it=>[11869], type=class java.lang.Integer 2019-01-09 13:52:10.724 (5) [P1T1]: quoteValue(), nothing special about it=>[73610], type=class java.lang.Integer 2019-01-09 13:52:10.725 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.725 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11869,null,73610,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.726 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.726 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.726 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.726 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.726 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.726 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.726 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11869","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.727 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.728 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.728 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.728 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.728 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.728 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.728 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11870")) limit 1] 2019-01-09 13:52:10.729 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.729 (5) [P1T1]: 
addSub("db") 2019-01-09 13:52:10.729 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.729 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.729 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.729 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.id->73611 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11870 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.729 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.729 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.729 (5) [P1T1]: quoteValue(), nothing special about it=>[11870], type=class java.lang.Integer 2019-01-09 13:52:10.729 (5) [P1T1]: quoteValue(), nothing special about it=>[73611], type=class java.lang.Integer 2019-01-09 13:52:10.730 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.730 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11870,null,73611,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.731 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.731 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.731 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.731 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.731 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.731 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.731 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11870","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.732 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.732 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.732 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.732 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.732 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.732 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.733 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11941")) limit 1] 2019-01-09 13:52:10.734 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.734 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.734 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.734 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.734 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.734 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.id->73850 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: 
dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11941 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.734 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.734 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.734 (5) [P1T1]: quoteValue(), nothing special about it=>[11941], type=class java.lang.Integer 2019-01-09 13:52:10.734 (5) [P1T1]: quoteValue(), nothing special about it=>[73850], type=class java.lang.Integer 2019-01-09 13:52:10.734 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.734 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11941,null,73850,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.735 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.735 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.735 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.735 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.735 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.736 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.736 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11941","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.737 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.737 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.737 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.737 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.737 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.737 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.737 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11942")) limit 1] 2019-01-09 13:52:10.738 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.738 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.738 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.738 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.738 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.738 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.738 (5) [P1T1]: loaded column: dataAvailability.id->73851 2019-01-09 13:52:10.738 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.738 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.739 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.739 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11942 2019-01-09 13:52:10.739 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.739 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.739 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.739 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.739 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.739 (5) [P1T1]: 
quoteValue(), nothing special about it=>[11942], type=class java.lang.Integer 2019-01-09 13:52:10.739 (5) [P1T1]: quoteValue(), nothing special about it=>[73851], type=class java.lang.Integer 2019-01-09 13:52:10.739 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.739 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11942,null,73851,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.740 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.740 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.740 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.740 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.740 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.740 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.740 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11942","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.742 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.742 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.742 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.742 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.742 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.742 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.742 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11943")) limit 1] 2019-01-09 13:52:10.746 (5) [P1T1]: recordAsSubs("db", 4, "P1T1") 2019-01-09 13:52:10.746 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.746 (5) [P1T1]: ProfilerRec.end("db", true): ret=4 2019-01-09 13:52:10.746 (4) [P1T1]: time=0.004 2019-01-09 13:52:10.746 (4) [P1T1]: Query executed in 0.004s 2019-01-09 13:52:10.746 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.id->73852 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11943 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.746 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.746 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.746 (5) [P1T1]: quoteValue(), nothing special about it=>[11943], type=class java.lang.Integer 2019-01-09 13:52:10.746 (5) [P1T1]: quoteValue(), nothing special about it=>[73852], type=class java.lang.Integer 2019-01-09 13:52:10.746 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.746 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11943,null,73852,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 
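Each data source in the trace above and below goes through the same three statements: a select on dataAvailability keyed by (date, dataSourceID), a REPLACE that rewrites the row with status "success", and an insert of an audit row into dataAvailabilityHistory. A rough JDBC sketch of that write pattern, reusing the SQL shapes from the log (the class and method names are hypothetical, not the actual summarizer code):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

// Hypothetical sketch of the per-data-source update pattern visible in the log:
// rewrite the dataAvailability row as "success", then append a history row.
public class DataAvailabilityWriter {

    public void markSuccess(Connection conn, int dataSourceId, long rowId,
                            String date, String updated) throws SQLException {
        String replaceSql =
            "replace into `dataAvailability` " +
            "(`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) " +
            "values (?,?,?,?,?,?,?,?,?)";
        try (PreparedStatement ps = conn.prepareStatement(replaceSql)) {
            ps.setString(1, date);                 // e.g. "2019-01-07"
            ps.setInt(2, dataSourceId);            // e.g. 11943
            ps.setNull(3, Types.VARCHAR);          // statusText
            ps.setLong(4, rowId);                  // existing dataAvailability.id, e.g. 73852
            ps.setString(5, "false");              // override
            ps.setString(6, updated);              // e.g. "2019-01-09 13:52:10"
            ps.setString(7, "00:00:00");           // oldUpdated
            ps.setString(8, "success");            // status
            ps.setString(9, "false");              // needsRerun
            ps.executeUpdate();
        }

        String historySql =
            "insert into `dataAvailabilityHistory` " +
            "(`date`,`dataSourceID`,`override`,`updated`,`status`) values (?,?,?,?,?)";
        try (PreparedStatement ps = conn.prepareStatement(historySql)) {
            ps.setString(1, date + " 00:00:00");
            ps.setInt(2, dataSourceId);
            ps.setString(3, "false");
            ps.setString(4, updated);
            ps.setString(5, "success");
            ps.executeUpdate();
        }
    }
}

The "2 row(s) affected" lines after each REPLACE are normal MySQL behavior: the server counts the delete of the existing row plus the insert of its replacement, while the history insert reports 1 row.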
2019-01-09 13:52:10.748 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.748 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.748 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.748 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.748 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.748 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.748 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11943","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.749 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.749 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.749 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.749 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.749 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.749 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.749 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11944")) limit 1] 2019-01-09 13:52:10.751 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.752 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.752 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.752 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.752 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.752 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.id->73853 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11944 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.752 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.752 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.752 (5) [P1T1]: quoteValue(), nothing special about it=>[11944], type=class java.lang.Integer 2019-01-09 13:52:10.752 (5) [P1T1]: quoteValue(), nothing special about it=>[73853], type=class java.lang.Integer 2019-01-09 13:52:10.752 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.752 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11944,null,73853,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.755 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.755 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.755 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.755 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.755 (4) [P1T1]: Query executed in 0.003s, 2 row(s) affected 2019-01-09 13:52:10.755 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.755 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 
00:00:00","11944","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.759 (5) [P1T1]: recordAsSubs("db", 4, "P1T1") 2019-01-09 13:52:10.759 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.759 (5) [P1T1]: ProfilerRec.end("db", true): ret=4 2019-01-09 13:52:10.759 (4) [P1T1]: time=0.004 2019-01-09 13:52:10.759 (4) [P1T1]: Query executed in 0.004s, 1 row(s) affected 2019-01-09 13:52:10.759 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.759 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11945")) limit 1] 2019-01-09 13:52:10.761 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.761 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.761 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.761 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.761 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.761 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.id->73854 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11945 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.761 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.761 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.761 (5) [P1T1]: quoteValue(), nothing special about it=>[11945], type=class java.lang.Integer 2019-01-09 13:52:10.761 (5) [P1T1]: quoteValue(), nothing special about it=>[73854], type=class java.lang.Integer 2019-01-09 13:52:10.761 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.761 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11945,null,73854,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.764 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.764 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.764 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.764 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.764 (4) [P1T1]: Query executed in 0.003s, 2 row(s) affected 2019-01-09 13:52:10.765 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.765 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11945","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.767 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.767 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.767 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.767 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.767 (4) [P1T1]: Query executed in 0.002s, 1 row(s) affected 2019-01-09 13:52:10.767 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.767 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where 
((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11946")) limit 1] 2019-01-09 13:52:10.768 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.768 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.768 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.768 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.768 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.768 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.768 (5) [P1T1]: loaded column: dataAvailability.id->73855 2019-01-09 13:52:10.768 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.768 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.769 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.769 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11946 2019-01-09 13:52:10.769 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.769 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.769 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.769 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.769 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.769 (5) [P1T1]: quoteValue(), nothing special about it=>[11946], type=class java.lang.Integer 2019-01-09 13:52:10.769 (5) [P1T1]: quoteValue(), nothing special about it=>[73855], type=class java.lang.Integer 2019-01-09 13:52:10.769 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.769 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11946,null,73855,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.770 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.770 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.770 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.770 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.770 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.770 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.770 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11946","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.771 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.772 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.772 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.772 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.772 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.772 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.772 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11947")) limit 1] 2019-01-09 13:52:10.773 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.773 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.773 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.773 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.773 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.773 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.id->73856 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: 
dataAvailability.date->2019-01-07 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11947 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 2019-01-09 13:52:10.773 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.773 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.774 (5) [P1T1]: quoteValue(), nothing special about it=>[11947], type=class java.lang.Integer 2019-01-09 13:52:10.774 (5) [P1T1]: quoteValue(), nothing special about it=>[73856], type=class java.lang.Integer 2019-01-09 13:52:10.774 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.774 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11947,null,73856,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.776 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.776 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.776 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.776 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.776 (4) [P1T1]: Query executed in 0.002s, 2 row(s) affected 2019-01-09 13:52:10.776 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.776 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11947","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.779 (5) [P1T1]: recordAsSubs("db", 3, "P1T1") 2019-01-09 13:52:10.779 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.779 (5) [P1T1]: ProfilerRec.end("db", true): ret=3 2019-01-09 13:52:10.779 (4) [P1T1]: time=0.003 2019-01-09 13:52:10.779 (4) [P1T1]: Query executed in 0.003s, 1 row(s) affected 2019-01-09 13:52:10.779 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.779 (4) [P1T1]: spike: SQL->[select * from `dataAvailability` where ((`date`="2019-01-07 00:00:00") and (`dataSourceID`="11948")) limit 1] 2019-01-09 13:52:10.781 (5) [P1T1]: recordAsSubs("db", 2, "P1T1") 2019-01-09 13:52:10.781 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.781 (5) [P1T1]: ProfilerRec.end("db", true): ret=2 2019-01-09 13:52:10.781 (4) [P1T1]: time=0.002 2019-01-09 13:52:10.781 (4) [P1T1]: Query executed in 0.002s 2019-01-09 13:52:10.781 (5) [P1T1]: loading 9 columns from result set: 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.id->73857 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.date->2019-01-07 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.oldUpdated->00:00:00 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.updated->2019-01-09 13:52:10.0 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.dataSourceID->11948 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.status->in progress 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.statusText->null 2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.needsRerun->false 
2019-01-09 13:52:10.781 (5) [P1T1]: loaded column: dataAvailability.override->false 2019-01-09 13:52:10.781 (4) [P1T1]: Record Loaded 2019-01-09 13:52:10.781 (5) [P1T1]: quoteValue(), nothing special about it=>[11948], type=class java.lang.Integer 2019-01-09 13:52:10.781 (5) [P1T1]: quoteValue(), nothing special about it=>[73857], type=class java.lang.Integer 2019-01-09 13:52:10.781 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.781 (4) [P1T1]: spike: SQL->[replace into `dataAvailability` (`date`,`dataSourceID`,`statusText`,`id`,`override`,`updated`,`oldUpdated`,`status`,`needsRerun`) values ("2019-01-07",11948,null,73857,"false","2019-01-09 13:52:10","00:00:00","success","false")], genflags=yes 2019-01-09 13:52:10.782 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.782 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.782 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.782 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.782 (4) [P1T1]: Query executed in 0.001s, 2 row(s) affected 2019-01-09 13:52:10.783 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.783 (4) [P1T1]: spike: SQL->[insert into `dataAvailabilityHistory` (`date`,`dataSourceID`,`override`,`updated`,`status`) values ("2019-01-07 00:00:00","11948","false","2019-01-09 13:52:10","success")], genflags=yes 2019-01-09 13:52:10.784 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.784 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.784 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.784 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.784 (4) [P1T1]: Query executed in 0.001s, 1 row(s) affected 2019-01-09 13:52:10.784 (3) [P1T1]: Finished applying algorithm [SABB] for account [2502] in distributions [3, 178, 147] 2019-01-09 13:52:10.784 (5) [P1T1]: ProfilerRec.start("db",true) in Thread [P1T1] 2019-01-09 13:52:10.784 (4) [P1T1]: spike: SQL->[SELECT RELEASE_LOCK('com.thesearchagency.admax.common.AdMaxCampaignMigrator.2502')] 2019-01-09 13:52:10.785 (5) [P1T1]: recordAsSubs("db", 1, "P1T1") 2019-01-09 13:52:10.785 (5) [P1T1]: addSub("db") 2019-01-09 13:52:10.785 (5) [P1T1]: ProfilerRec.end("db", true): ret=1 2019-01-09 13:52:10.785 (4) [P1T1]: time=0.001 2019-01-09 13:52:10.785 (4) [P1T1]: Query executed in 0.001s 2019-01-09 13:52:10.785 (3) [P1T1]: Released migrator lock for Account ID #2502 2019-01-09 13:52:10.785 (4) [P1T1]: Returning Object to Pool 2019-01-09 13:52:10.785 (4) [P1T1]: Returning Object to Pool 2019-01-09 13:52:10.785 (4) [P1T1]: Returning Object to Pool 2019-01-09 13:52:10.785 (4) [P1T1]: Returning Object to Pool 2019-01-09 13:52:10.785 (3) [P1T1]: Completed Account ID #2502: eb11f94b-6db1-4004-997a-53a901b02366 2019-01-09 13:52:10.788 (4) [main]: closed [jdbc:mysql://tsacommon-01-write/tsacommon?useUnicode=true&zeroDateTimeBehavior=convertToNull&dontTrackOpenResources=true&jdbcCompliantTruncation=false&useServerPrepStmts=false&rewriteBatchedStatements=true] 2019-01-09 13:52:10.788 (3) [main]: ------------AdMax Summarizer done 2019-01-09 13:52:10.788 (3) [main]: =============== Completed AdMax Summarizer ================ 2019-01-09 13:52:10.788 (4) [main]: DataCache: Found key global-logging 2019-01-09 13:52:10.788 (4) [main]: Took Existing Object from Pool 2019-01-09 13:52:10.788 (5) [main]: ProfilerRec.start("db",true) in Thread [main] 2019-01-09 13:52:10.788 (4) [main]: spike: SQL->[select @@version] 2019-01-09 13:52:10.790 (2) [main]: Warning(s) detected in statement [select @@version] executed by user [spike] 
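The per-account work in this run is bracketed by MySQL named locks: the log shows SELECT GET_LOCK('com.thesearchagency.admaxlistener.dao.TSACommonDAO.eb11f94b-6db1-4004-997a-53a901b02366', 7200) before the account is touched, and RELEASE_LOCK calls for the DAO lock and for the AdMaxCampaignMigrator lock once the account completes ("Released migrator lock for Account ID #2502" above). A hedged sketch of that serialization pattern follows; the helper below is illustrative, not the actual TSACommonDAO code.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Hypothetical sketch of the GET_LOCK / RELEASE_LOCK bracketing seen in the log,
// which keeps two processes from working on the same account at the same time.
public class AccountLock {

    /** Acquires the named MySQL lock, runs the work, and always releases the lock. */
    public static void withLock(Connection conn, String lockName, int timeoutSeconds,
                                Runnable work) throws SQLException {
        try (PreparedStatement get = conn.prepareStatement("SELECT GET_LOCK(?, ?)")) {
            get.setString(1, lockName);        // e.g. "com.thesearchagency....TSACommonDAO.<account>"
            get.setInt(2, timeoutSeconds);     // the log waits "up to 7200 seconds"
            try (ResultSet rs = get.executeQuery()) {
                // GET_LOCK returns 1 on success, 0 on timeout, NULL on error.
                if (!rs.next() || rs.getInt(1) != 1) {
                    throw new SQLException("Could not lock " + lockName);
                }
            }
        }
        try {
            work.run();
        } finally {
            try (PreparedStatement rel = conn.prepareStatement("SELECT RELEASE_LOCK(?)")) {
                rel.setString(1, lockName);
                rel.executeQuery().close();
            }
        }
    }
}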
2019-01-09 13:52:10.790 (2) [main]: Table 'logging.processHistory' doesn't exist (1146)
2019-01-09 13:52:10.790 (5) [main]: recordAsSubs("db", 2, "main")
2019-01-09 13:52:10.790 (5) [main]: addSub("db")
2019-01-09 13:52:10.790 (5) [main]: ProfilerRec.end("db", true): ret=2
2019-01-09 13:52:10.790 (4) [main]: time=0.002
2019-01-09 13:52:10.790 (4) [main]: Query executed in 0.002s
2019-01-09 13:52:10.790 (4) [main]: test query succeeded, returned "5.5.34"
2019-01-09 13:52:10.790 (3) [main]: Could not save process history stop
2019-01-09 13:52:10.790 (4) [main]: Returning Object to Pool
2019-01-09 13:52:10.790 (3) [main]: =============== Stats ================
2019-01-09 13:52:10.790 (3) [main]: --- Process ---
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: ABU Process(start): Wed Jan 09 13:52:09 UTC 2019
2019-01-09 13:52:10.790 (3) [main]: ABU Process(stop): Wed Jan 09 13:52:10 UTC 2019
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: --- Accounts ---
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: Accounts Processed: 1
2019-01-09 13:52:10.790 (3) [main]: Accounts Skipped: 0
2019-01-09 13:52:10.790 (3) [main]: Accounts Processed Successfully: 1
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: --- Campaigns ---
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: Campaigns Processed: 4
2019-01-09 13:52:10.790 (3) [main]: Campaigns Processed Successfully: 0
2019-01-09 13:52:10.790 (3) [main]: Campaign Budgets Capped: 4
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: --- SE Campaigns ---
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: SE Campaigns Processed: 16
2019-01-09 13:52:10.790 (3) [main]: SE Campaigns Processed Successfully: 8
2019-01-09 13:52:10.790 (3) [main]:
2019-01-09 13:52:10.790 (3) [main]: --- Budget/Spend ---
2019-01-09 13:52:10.791 (3) [main]:
2019-01-09 13:52:10.791 (3) [main]: SE Campaign Budgets Updated: 8
2019-01-09 13:52:10.791 (3) [main]: SE Campaign Budgets Sent To Google: 0
2019-01-09 13:52:10.791 (3) [main]: SE Campaign Budgets Sent To Yahoo Japan:0
2019-01-09 13:52:10.791 (3) [main]: SE Campaign Budgets Sent To Bing: 0
2019-01-09 13:52:10.791 (3) [main]:
2019-01-09 13:52:10.791 (3) [main]: Spend Total: 0.0
2019-01-09 13:52:10.791 (3) [main]: Common Carryover Total: 3.2252836
2019-01-09 13:52:10.791 (3) [main]:
2019-01-09 13:52:10.791 (3) [main]: --- Bids ---
2019-01-09 13:52:10.791 (3) [main]:
2019-01-09 13:52:10.791 (3) [main]: New Campaign Bids: 0
2019-01-09 13:52:10.791 (3) [main]:
2019-01-09 13:52:10.791 (5) [main]: ProfilerRec.end("AdMaxSummarizer", false): ret=1560
2019-01-09 13:52:10.791 (4) [main]: time=1.56
2019-01-09 13:52:10.791 (3) [main]: AdMaxSummarizer: 1.560s
2019-01-09 13:52:10.791 (3) [main]: by key:
2019-01-09 13:52:10.791 (4) [main]: time=0.263
2019-01-09 13:52:10.791 (3) [main]: db: 0.263s
2019-01-09 13:52:10.791 (3) [main]: by thread:
2019-01-09 13:52:10.791 (4) [main]: time=0.01
2019-01-09 13:52:10.791 (3) [main]: main: 0.010s
2019-01-09 13:52:10.791 (4) [main]: time=0.253
2019-01-09 13:52:10.791 (3) [main]: P1T1: 0.253s
2019-01-09 13:52:10.791 (4) [main]: time=1.297
2019-01-09 13:52:10.791 (3) [main]: other: 1.297s
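As a quick consistency check on the profiler summary above: the per-thread times sum to the reported total, 0.010 s (main) + 0.253 s (P1T1) + 1.297 s (other) = 1.560 s, matching the AdMaxSummarizer figure, while the "by key" breakdown attributes 0.263 s of the run to the "db" scope.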