Hi all,

While running indexing for a baseline update, the crawl fails. The logs are below; the key error is java.lang.OutOfMemoryError: GC overhead limit exceeded, thrown during dimension value ID generation. Any suggestions would be appreciated.
2017-09-19 16:01:27,822 INFO [cas] [654845766@jetty-6] com.endeca.itl.service.CasCrawlerImpl: [trimark-last-mile-crawl] startCrawl start
2017-09-19 16:01:28,682 INFO [cas] [654845766@jetty-6] com.endeca.itl.executor.ProcessorTaskTiming: Processor tasks have timing enabled
2017-09-19 16:01:28,699 INFO [trimark-data] [654845766@jetty-6] com.endeca.itl.recordstore.impl.RecordStoreImpl: Started transaction 48 of type READ
2017-09-19 16:01:28,702 INFO [trimark-dimvals] [654845766@jetty-6] com.endeca.itl.recordstore.impl.RecordStoreImpl: Started transaction 49 of type READ
2017-09-19 16:01:28,748 INFO [cas] [654845766@jetty-6] com.endeca.itl.service.CrawlRun.[trimark-last-mile-crawl]: [trimark-last-mile-crawl] Starting crawl
2017-09-19 16:01:28,757 INFO [cas] [654845766@jetty-6] com.endeca.itl.executor.Pipeline.[trimark-last-mile-crawl]: Preparing for acquisition
2017-09-19 16:02:03,422 INFO [cas] [654845766@jetty-6] com.endeca.itl.executor.Pipeline.[trimark-last-mile-crawl]: Prepare for acquisition took 34665 ms
2017-09-19 16:02:03,422 INFO [cas] [654845766@jetty-6] com.endeca.itl.executor.TaskManager.[trimark-last-mile-crawl]: Starting crawl
2017-09-19 16:02:03,426 INFO [cas] [654845766@jetty-6] com.endeca.itl.service.CasCrawlerImpl: [trimark-last-mile-crawl] startCrawl end
2017-09-19 16:02:03,445 INFO [trimark-data] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.storage.RecordStorageFileMergeCursor: Creating merge cursor over 1 generation files
2017-09-19 16:02:03,464 INFO [trimark-data] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.RecordStoreImpl: Started baseline read of generation 15 for transaction 48 with read cursor 30
2017-09-19 16:02:03,466 INFO [cas] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.extension.source.merger.RecordStoreMergerDataSourceRuntime.[trimark-last-mile-crawl]: Finished reading 0 records from Record Store trimark-data.
2017-09-19 16:02:03,485 INFO [trimark-data] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.storage.BaselineReadCursor: Baseline read cursor skipped 0 deletes
2017-09-19 16:02:03,485 INFO [trimark-data] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.RecordStoreImpl: Ended read cursor 30 for transaction 48
2017-09-19 16:02:03,489 INFO [trimark-dimvals] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.storage.RecordStorageFileMergeCursor: Creating merge cursor over 1 generation files
2017-09-19 16:02:03,503 INFO [trimark-dimvals] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.RecordStoreImpl: Started baseline read of generation 14 for transaction 49 with read cursor 30
2017-09-19 16:02:03,639 INFO [cas] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.extension.source.merger.RecordStoreMergerDataSourceRuntime.[trimark-last-mile-crawl]: Finished reading 2064 records from Record Store trimark-dimvals.
2017-09-19 16:02:03,648 INFO [trimark-dimvals] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.storage.BaselineReadCursor: Baseline read cursor skipped 0 deletes
2017-09-19 16:02:03,648 INFO [trimark-dimvals] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.recordstore.impl.RecordStoreImpl: Ended read cursor 30 for transaction 49
2017-09-19 16:02:03,713 INFO [cas] [cas-trimark-last-mile-crawl-worker-4] com.endeca.itl.executor.output.mdex.MdexOutputSink.[trimark-last-mile-crawl]: Beginning to process dimension value records for output
2017-09-19 16:02:03,722 INFO [cas] [cas-trimark-last-mile-crawl-worker-4] com.endeca.itl.executor.output.mdex.dimension.DimensionForest.[trimark-last-mile-crawl]: Validating non-autogen dimension hierarchies took 9 ms
2017-09-19 16:02:03,802 INFO [cas] [cas-trimark-last-mile-crawl-worker-4] com.endeca.itl.executor.output.mdex.dimension.DimensionTransformer.[trimark-last-mile-crawl]: Writing non-autogen dimensions XML took 80 ms
2017-09-19 16:07:02,587 ERROR [trimark-dimension-value-id-manager] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.dvalidmgr.impl.DimensionValueIdManagerImpl: Error executing method DimensionValueIdManagerImpl.generateDimensionValueIds()
com.endeca.itl.ItlRuntimeException: java.sql.SQLException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.endeca.itl.db.PreparedStatementExecutorImpl.executeAndReturnResultSet(PreparedStatementExecutorImpl.java:157)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.getDimensionValueId(DimensionValueIdMapImpl.java:165)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.createDimensionValueIdIfNotPresent(DimensionValueIdMapImpl.java:205)
at com.endeca.itl.dvalidmgr.impl.TableSwitchingDimensionValueIdMap.createDimensionValueIdIfNotPresent(TableSwitchingDimensionValueIdMap.java:158)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdManagerImpl.generateDimensionValueIds(DimensionValueIdManagerImpl.java:156)
at sun.reflect.GeneratedMethodAccessor93.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.endeca.itl.service.ServicePublisher$1.invoke(ServicePublisher.java:121)
at com.sun.proxy.$Proxy43.generateDimensionValueIds(Unknown Source)
at com.endeca.itl.executor.output.mdex.dimension.CachingDimensionValueIdManager.generateDimensionValueIds(CachingDimensionValueIdManager.java:118)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.addDimensionsValueNodes(DimensionForest.java:157)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.buildAutogenTree(DimensionForest.java:144)
at com.endeca.itl.executor.output.mdex.dimension.DimensionTransformer.transformAutogenDimension(DimensionTransformer.java:119)
at com.endeca.itl.executor.output.mdex.FullMdexOutputHandler.transformAutoGenDimensions(FullMdexOutputHandler.java:235)
at com.endeca.itl.executor.output.mdex.FullMdexOutputHandler.processDataInputClosed(FullMdexOutputHandler.java:187)
at com.endeca.itl.executor.output.mdex.MdexOutputSink.processDataInputClosed(MdexOutputSink.java:226)
at com.endeca.itl.executor.output.mdex.MdexOutputSink.notifyInputClosed(MdexOutputSink.java:204)
at com.endeca.itl.executor.TaskManager$4.work(TaskManager.java:353)
at com.endeca.itl.executor.WorkExecutor$WorkRunnable.run(WorkExecutor.java:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.endeca.itl.util.LoggingContextAwareThread.run(LoggingContextAwareThread.java:71)
Caused by: java.sql.SQLException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.hsqldb.jdbc.Util.sqlException(Util.java:418)
at org.hsqldb.jdbc.Util.sqlException(Util.java:247)
at org.hsqldb.jdbc.JDBCPreparedStatement.fetchResult(JDBCPreparedStatement.java:4656)
at org.hsqldb.jdbc.JDBCPreparedStatement.executeQuery(JDBCPreparedStatement.java:283)
at com.endeca.itl.db.PreparedStatementExecutorImpl.executeAndReturnResultSet(PreparedStatementExecutorImpl.java:155)
... 23 more
Caused by: org.hsqldb.HsqlException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.hsqldb.error.Error.error(Error.java:111)
at org.hsqldb.result.Result.newErrorResult(Result.java:1056)
at org.hsqldb.StatementDMQL.execute(StatementDMQL.java:192)
at org.hsqldb.Session.executeCompiledStatement(Session.java:1331)
at org.hsqldb.Session.execute(Session.java:984)
at org.hsqldb.jdbc.JDBCPreparedStatement.fetchResult(JDBCPreparedStatement.java:4648)
... 25 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.String.<init>(String.java:166)
at java.lang.String.valueOf(String.java:2996)
at org.hsqldb.types.CharacterType.compare(CharacterType.java:427)
at org.hsqldb.index.IndexAVL.compareRowNonUnique(IndexAVL.java:539)
at org.hsqldb.index.IndexAVL.findNode(IndexAVL.java:1386)
at org.hsqldb.index.IndexAVL.findFirstRow(IndexAVL.java:972)
at org.hsqldb.RangeVariable$RangeIteratorMain.getFirstRow(RangeVariable.java:1171)
at org.hsqldb.RangeVariable$RangeIteratorMain.initialiseIterator(RangeVariable.java:1081)
at org.hsqldb.RangeVariable$RangeIteratorMain.next(RangeVariable.java:1015)
at org.hsqldb.QuerySpecification.buildResult(QuerySpecification.java:1386)
at org.hsqldb.QuerySpecification.getSingleResult(QuerySpecification.java:1305)
at org.hsqldb.QuerySpecification.getResult(QuerySpecification.java:1295)
at org.hsqldb.StatementQuery.getResult(StatementQuery.java:66)
at org.hsqldb.StatementDMQL.execute(StatementDMQL.java:190)
at org.hsqldb.Session.executeCompiledStatement(Session.java:1331)
at org.hsqldb.Session.execute(Session.java:984)
at org.hsqldb.jdbc.JDBCPreparedStatement.fetchResult(JDBCPreparedStatement.java:4648)
at org.hsqldb.jdbc.JDBCPreparedStatement.executeQuery(JDBCPreparedStatement.java:283)
at com.endeca.itl.db.PreparedStatementExecutorImpl.executeAndReturnResultSet(PreparedStatementExecutorImpl.java:155)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.getDimensionValueId(DimensionValueIdMapImpl.java:165)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.createDimensionValueIdIfNotPresent(DimensionValueIdMapImpl.java:205)
at com.endeca.itl.dvalidmgr.impl.TableSwitchingDimensionValueIdMap.createDimensionValueIdIfNotPresent(TableSwitchingDimensionValueIdMap.java:158)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdManagerImpl.generateDimensionValueIds(DimensionValueIdManagerImpl.java:156)
at sun.reflect.GeneratedMethodAccessor93.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.endeca.itl.service.ServicePublisher$1.invoke(ServicePublisher.java:121)
at com.sun.proxy.$Proxy43.generateDimensionValueIds(Unknown Source)
at com.endeca.itl.executor.output.mdex.dimension.CachingDimensionValueIdManager.generateDimensionValueIds(CachingDimensionValueIdManager.java:118)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.addDimensionsValueNodes(DimensionForest.java:157)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.buildAutogenTree(DimensionForest.java:144)
2017-09-19 16:07:02,588 ERROR [cas] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.executor.TaskManager.[trimark-last-mile-crawl]: Fatal execution error performing work "MdexOutputSink-478779400 notified input closed (AsynchronousChannel-1786147159)". Aborting crawl.
com.endeca.itl.executor.FatalExecutionException: Error writing to MDEX output
at com.endeca.itl.executor.output.mdex.MdexOutputSink.processDataInputClosed(MdexOutputSink.java:232)
at com.endeca.itl.executor.output.mdex.MdexOutputSink.notifyInputClosed(MdexOutputSink.java:204)
at com.endeca.itl.executor.TaskManager$4.work(TaskManager.java:353)
at com.endeca.itl.executor.WorkExecutor$WorkRunnable.run(WorkExecutor.java:194)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.endeca.itl.util.LoggingContextAwareThread.run(LoggingContextAwareThread.java:71)
Caused by: com.endeca.itl.ItlRuntimeException: java.sql.SQLException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.endeca.itl.db.PreparedStatementExecutorImpl.executeAndReturnResultSet(PreparedStatementExecutorImpl.java:157)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.getDimensionValueId(DimensionValueIdMapImpl.java:165)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.createDimensionValueIdIfNotPresent(DimensionValueIdMapImpl.java:205)
at com.endeca.itl.dvalidmgr.impl.TableSwitchingDimensionValueIdMap.createDimensionValueIdIfNotPresent(TableSwitchingDimensionValueIdMap.java:158)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdManagerImpl.generateDimensionValueIds(DimensionValueIdManagerImpl.java:156)
at sun.reflect.GeneratedMethodAccessor93.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.endeca.itl.service.ServicePublisher$1.invoke(ServicePublisher.java:121)
at com.sun.proxy.$Proxy43.generateDimensionValueIds(Unknown Source)
at com.endeca.itl.executor.output.mdex.dimension.CachingDimensionValueIdManager.generateDimensionValueIds(CachingDimensionValueIdManager.java:118)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.addDimensionsValueNodes(DimensionForest.java:157)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.buildAutogenTree(DimensionForest.java:144)
at com.endeca.itl.executor.output.mdex.dimension.DimensionTransformer.transformAutogenDimension(DimensionTransformer.java:119)
at com.endeca.itl.executor.output.mdex.FullMdexOutputHandler.transformAutoGenDimensions(FullMdexOutputHandler.java:235)
at com.endeca.itl.executor.output.mdex.FullMdexOutputHandler.processDataInputClosed(FullMdexOutputHandler.java:187)
at com.endeca.itl.executor.output.mdex.MdexOutputSink.processDataInputClosed(MdexOutputSink.java:226)
... 7 more
Caused by: java.sql.SQLException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.hsqldb.jdbc.Util.sqlException(Util.java:418)
at org.hsqldb.jdbc.Util.sqlException(Util.java:247)
at org.hsqldb.jdbc.JDBCPreparedStatement.fetchResult(JDBCPreparedStatement.java:4656)
at org.hsqldb.jdbc.JDBCPreparedStatement.executeQuery(JDBCPreparedStatement.java:283)
at com.endeca.itl.db.PreparedStatementExecutorImpl.executeAndReturnResultSet(PreparedStatementExecutorImpl.java:155)
... 23 more
Caused by: org.hsqldb.HsqlException: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.hsqldb.error.Error.error(Error.java:111)
at org.hsqldb.result.Result.newErrorResult(Result.java:1056)
at org.hsqldb.StatementDMQL.execute(StatementDMQL.java:192)
at org.hsqldb.Session.executeCompiledStatement(Session.java:1331)
at org.hsqldb.Session.execute(Session.java:984)
at org.hsqldb.jdbc.JDBCPreparedStatement.fetchResult(JDBCPreparedStatement.java:4648)
... 25 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.String.<init>(String.java:166)
at java.lang.String.valueOf(String.java:2996)
at org.hsqldb.types.CharacterType.compare(CharacterType.java:427)
at org.hsqldb.index.IndexAVL.compareRowNonUnique(IndexAVL.java:539)
at org.hsqldb.index.IndexAVL.findNode(IndexAVL.java:1386)
at org.hsqldb.index.IndexAVL.findFirstRow(IndexAVL.java:972)
at org.hsqldb.RangeVariable$RangeIteratorMain.getFirstRow(RangeVariable.java:1171)
at org.hsqldb.RangeVariable$RangeIteratorMain.initialiseIterator(RangeVariable.java:1081)
at org.hsqldb.RangeVariable$RangeIteratorMain.next(RangeVariable.java:1015)
at org.hsqldb.QuerySpecification.buildResult(QuerySpecification.java:1386)
at org.hsqldb.QuerySpecification.getSingleResult(QuerySpecification.java:1305)
at org.hsqldb.QuerySpecification.getResult(QuerySpecification.java:1295)
at org.hsqldb.StatementQuery.getResult(StatementQuery.java:66)
at org.hsqldb.StatementDMQL.execute(StatementDMQL.java:190)
at org.hsqldb.Session.executeCompiledStatement(Session.java:1331)
at org.hsqldb.Session.execute(Session.java:984)
at org.hsqldb.jdbc.JDBCPreparedStatement.fetchResult(JDBCPreparedStatement.java:4648)
at org.hsqldb.jdbc.JDBCPreparedStatement.executeQuery(JDBCPreparedStatement.java:283)
at com.endeca.itl.db.PreparedStatementExecutorImpl.executeAndReturnResultSet(PreparedStatementExecutorImpl.java:155)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.getDimensionValueId(DimensionValueIdMapImpl.java:165)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdMapImpl.createDimensionValueIdIfNotPresent(DimensionValueIdMapImpl.java:205)
at com.endeca.itl.dvalidmgr.impl.TableSwitchingDimensionValueIdMap.createDimensionValueIdIfNotPresent(TableSwitchingDimensionValueIdMap.java:158)
at com.endeca.itl.dvalidmgr.impl.DimensionValueIdManagerImpl.generateDimensionValueIds(DimensionValueIdManagerImpl.java:156)
at sun.reflect.GeneratedMethodAccessor93.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.endeca.itl.service.ServicePublisher$1.invoke(ServicePublisher.java:121)
at com.sun.proxy.$Proxy43.generateDimensionValueIds(Unknown Source)
at com.endeca.itl.executor.output.mdex.dimension.CachingDimensionValueIdManager.generateDimensionValueIds(CachingDimensionValueIdManager.java:118)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.addDimensionsValueNodes(DimensionForest.java:157)
at com.endeca.itl.executor.output.mdex.dimension.DimensionForest.buildAutogenTree(DimensionForest.java:144)
2017-09-19 16:07:02,589 INFO [cas] [cas-trimark-last-mile-crawl-worker-1] com.endeca.itl.executor.TaskManager.[trimark-last-mile-crawl]: Interrupting crawl in anticipation of stopping
2017-09-19 16:07:06,634 INFO [cas] [trimark-last-mile-crawl Kill thread] com.endeca.itl.executor.TaskManager.[trimark-last-mile-crawl]: Stopping crawl
2017-09-19 16:07:18,639 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.service.CrawlRun.[trimark-last-mile-crawl]: [trimark-last-mile-crawl] Finishing crawl
2017-09-19 16:07:22,605 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.executor.output.mdex.FullDataRecordTransformer.[trimark-last-mile-crawl]: Total invalid dval assignments = 0
2017-09-19 16:07:26,688 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.executor.output.mdex.dimension.DimensionsXmlWriter.[trimark-last-mile-crawl]: Wrote 11112 dimension values: 2066 non-autogen and 9046 autogen.
2017-09-19 16:07:26,690 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.executor.output.mdex.dimension.CachingDimensionValueIdManager.[trimark-last-mile-crawl]: DimensionValueIdManager local cache hit percentage = 0.0%
2017-09-19 16:07:26,693 INFO [trimark-data] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.recordstore.impl.RecordStoreImpl: Rolled back transaction 48
2017-09-19 16:07:26,693 INFO [trimark-dimvals] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.recordstore.impl.RecordStoreImpl: Rolled back transaction 49
2017-09-19 16:07:26,700 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.executor.Pipeline.[trimark-last-mile-crawl]: End acquisition took 4096 ms
2017-09-19 16:07:26,700 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.db.Database: Size of embedded database: 7819264 kb
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Crawl State = NOT_RUNNING
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Crawl Mode = FULL_CRAWL
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Crawl Stop Cause = FAILED
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Crawl Failure Reason = Error writing to MDEX output
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Files Filtered = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Files Filtered from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Directories Filtered Not from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Failed Records = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Directories Filtered = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Directories Crawled = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Documents Unsuccessfully Converted = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Documents Converted After Retry = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Files Crawled = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Files Crawled from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Documents Converted = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Delete Records Output = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Files Crawled Not from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: New or Updated Records Output = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Total Records Output = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Directories Crawled from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Directories Crawled Not from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Directories Filtered from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Files Filtered Not from Archives = 0
2017-09-19 16:07:26,703 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Crawl Seconds = 357
2017-09-19 16:07:26,704 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: Start Time = 9/19/17 4:01:28 PM EDT
2017-09-19 16:07:26,704 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]: End Time = 9/19/17 4:07:26 PM EDT
2017-09-19 16:07:26,709 INFO [cas] [trimark-last-mile-crawl-pipeline-monitor] com.endeca.itl.cas.api.CrawlMetrics.[trimark-last-mile-crawl]:
Processor Task Timing
MDEX Output (notifyInputClosed): (Total=298882.231 ms, Avg=149441.115 ms, Hits=2, StdDev=211212.675 ms, Min=91.200 ms, Max=298791.030 ms, FirstTimed=16:02:03,802, LastTimed=16:07:06,634)
Record Store Merger Data Source: (Hits=1, Value=220.053 ms, Time=16:02:03,648)
MDEX Output (processRecord): (Total=182.629 ms, Avg=0.088 ms, Hits=2064, StdDev=2.467 ms, Min=0.001 ms, Max=79.460 ms, FirstTimed=16:02:03,529, LastTimed=16:02:03,711)
Dimension Value Record Splitting (processRecord): (Total=119.272 ms, Avg=0.058 ms, Hits=2064, StdDev=0.453 ms, Min=0.006 ms, Max=9.774 ms, FirstTimed=16:02:03,529, LastTimed=16:02:03,638)
Dimension Value Record Splitting (notifyInputClosed): (Hits=1, Value=0.116 ms, Time=16:02:03,648)
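
From the stack traces, the root cause is java.lang.OutOfMemoryError: GC overhead limit exceeded inside the embedded HSQLDB while DimensionValueIdManagerImpl.generateDimensionValueIds() builds the autogen dimension tree; in other words, the CAS service JVM is spending nearly all of its time in garbage collection without reclaiming memory. Two log lines suggest an undersized heap rather than bad data: the embedded database is about 7.5 GB ("Size of embedded database: 7819264 kb"), and the DimensionValueIdManager local cache hit percentage is 0.0%, so every dimension value lookup goes to the database. A common first step is to raise the CAS service's maximum JVM heap and restart CAS. A minimal sketch, assuming a Unix install whose startup script sets the heap; the script path and variable name below are assumptions, so locate the equivalent in your CAS version:

    # Hypothetical edit to the CAS service launch script, e.g. $CAS_ROOT/bin/cas-service.sh
    # (path is an assumption). -Xms/-Xmx are standard HotSpot JVM flags; the variable
    # name CAS_JAVA_OPTS is illustrative only; find where your script builds the java
    # command line and raise -Xmx there.
    CAS_JAVA_OPTS="-Xms2048m -Xmx8192m"   # size -Xmx to the host's available RAM

    # After restarting CAS, confirm the new limit took effect (jcmd ships with the JDK):
    #   jcmd <cas-service-pid> VM.flags | grep MaxHeapSize

If a larger heap only delays the failure, the volume of autogen dimension values (9046 here, per the DimensionsXmlWriter line) and the overall record volume would be the next things to look at.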