==================== Test output for //tensorflow/python/data/experimental/kernel_tests/service:distributed_save_ft_test (shard 1 of 17):
2024-04-02 05:19:51.547106: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
Running tests under Python 3.11.6: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/python_aarch64-unknown-linux-gnu/bin/python3
[ RUN ] SnapshotFtTest.testDatasetRecoversAndCompletes_test_mode_eager_tfapiversion_1_numworkers_1
[ SKIPPED ] SnapshotFtTest.testDatasetRecoversAndCompletes_test_mode_eager_tfapiversion_1_numworkers_1
[ RUN ] SnapshotFtTest.testLargeMultiSourceSnapshotRecoversAndCompletes_test_mode_graph_tfapiversion_1_numsources_1_numworkers_3
[ SKIPPED ] SnapshotFtTest.testLargeMultiSourceSnapshotRecoversAndCompletes_test_mode_graph_tfapiversion_1_numsources_1_numworkers_3
[ RUN ] SnapshotFtTest.testNestedDataset_test_mode_eager_tfapiversion_2_numworkers_1
2024-04-02 05:19:54.337938: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpwxf31s9r/tf_data_dispatcher_journal
2024-04-02 05:19:54.338024: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:19:54.338814: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpwxf31s9r" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:19:54.338853: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:32853
2024-04-02 05:19:54.338865: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:19:54.345670: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:32853. Worker config: protocol: "grpc" dispatcher_address: "localhost:32853" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:19:54.345879: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:38555
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1712035194.723153 1462461 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot
I0000 00:00:1712035194.807943 1462461 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot
2024-04-02 05:19:54.808983: I tensorflow/core/data/service/server_lib.cc:94] Shut down WorkerServer server running at port 38555
I0000 00:00:1712035194.885129 1462456 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, created stream_0 and assigned to localhost:38555
2024-04-02 05:19:54.925254: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 32853
2024-04-02 05:19:54.926135: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpwxf31s9r/tf_data_dispatcher_journal
2024-04-02 05:19:54.926410: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 77us.
2024-04-02 05:19:54.999001: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, stream: 0, compression: SNAPPY }
2024-04-02 05:19:54.999658: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, stream 0, chunk 0.
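(The dispatcher and worker above are the in-process tf.data service servers the test harness spins up. A minimal sketch of an equivalent setup with the public tf.data.experimental.service API; the work_dir path here is a hypothetical placeholder, not the test's own temp dir:)

    import tensorflow as tf

    # Start a fault-tolerant dispatcher. With a work_dir and
    # fault_tolerant_mode=True the dispatcher journals its state, which is
    # what the "Attempting to restore dispatcher state from journal" lines
    # in this log refer to.
    dispatcher = tf.data.experimental.service.DispatchServer(
        tf.data.experimental.service.DispatcherConfig(
            protocol="grpc",
            work_dir="/tmp/tf_data_work_dir",  # hypothetical path
            fault_tolerant_mode=True,
        ))

    # Register one worker with the dispatcher, mirroring the
    # "Worker registered with dispatcher running at localhost:..." line.
    worker = tf.data.experimental.service.WorkerServer(
        tf.data.experimental.service.WorkerConfig(
            dispatcher_address=dispatcher.target.split("://")[1],
            heartbeat_interval_ms=100,
        ))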
I0000 00:00:1712035195.111311 1464968 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot
2024-04-02 05:19:55.111572: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config port: 32853 protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpwxf31s9r" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:19:55.111673: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:32853
2024-04-02 05:19:55.111691: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035195.148831 1465636 parallel_tfrecord_writer.cc:167] Writing TFRecord of 10B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__71ad9dcbfef1ee7d_ldcg-aarch64-02-c6f5a2f2-1439200-6151641a4140f.tfrecord*.
2024-04-02 05:19:55.149519: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:124] tf.data service snapshot writer is cancelled: CANCELLED: The tf.data service snapshot writer has been cancelled.
2024-04-02 05:19:55.219034: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, stream: 0, compression: SNAPPY }
2024-04-02 05:19:55.219101: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:32853. Worker config: port: 38555 protocol: "grpc" dispatcher_address: "localhost:32853" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:19:55.219287: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:38555
2024-04-02 05:19:55.219541: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, stream 0, chunk 0.
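(Note the restart sequence in the lines above: the dispatcher comes back on the same port, replays its journal ("Restored from journal in 77us"), and the snapshot is resumed rather than restarted. Continuing the first sketch, a hedged illustration of that recovery, reusing the same hypothetical work_dir:)

    # Simulate a dispatcher crash/restart: a new DispatchServer bound to the
    # same port and work_dir replays the journal instead of starting from a
    # new state, so in-flight snapshots can resume.
    restarted_dispatcher = tf.data.experimental.service.DispatchServer(
        tf.data.experimental.service.DispatcherConfig(
            port=int(dispatcher.target.split(":")[-1]),  # reuse the old port
            protocol="grpc",
            work_dir="/tmp/tf_data_work_dir",  # same hypothetical work_dir
            fault_tolerant_mode=True,
        ))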
2024-04-02 05:19:55.635369: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 4950, chunk size: 48.3398KB.
2024-04-02 05:19:55.636209: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot/streams/stream_0/checkpoints/checkpoint_4_4950. Checkpointing distributed tf.data snapshot writer took 779us
2024-04-02 05:19:55.636986: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot/streams/stream_0/checkpoints
2024-04-02 05:19:55.637299: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot, stream: 0, compression: SNAPPY }
I0000 00:00:1712035195.722602 1467839 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpvvo7kbyx/tmpmr8z7cox/tf_data_snapshot
2024-04-02 05:19:55.835142: I tensorflow/core/data/service/server_lib.cc:94] Shut down WorkerServer server running at port 38555
2024-04-02 05:19:55.852408: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 32853
[ OK ] SnapshotFtTest.testNestedDataset_test_mode_eager_tfapiversion_2_numworkers_1
[ RUN ] SnapshotFtTest.testNonrepeatedDatasetDoesntProduceSecondRepetitionDir_test_mode_graph_tfapiversion_1_numsources_3_numworkers_5
[ SKIPPED ] SnapshotFtTest.testNonrepeatedDatasetDoesntProduceSecondRepetitionDir_test_mode_graph_tfapiversion_1_numsources_3_numworkers_5
[ RUN ] SnapshotFtTest.testRepeatedDatasetRecoversAndCompletes_test_mode_eager_tfapiversion_2_numelements_1000_numrepetitions_10_numworkers_1
2024-04-02 05:19:55.866464: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp7a4erfm2/tf_data_dispatcher_journal
2024-04-02 05:19:55.866561: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:19:55.866895: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp7a4erfm2" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:19:55.866936: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:39165
2024-04-02 05:19:55.866949: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:19:55.975672: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:39165. Worker config: protocol: "grpc" dispatcher_address: "localhost:39165" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:19:55.975939: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:33565
I0000 00:00:1712035195.991134 1471203 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot
I0000 00:00:1712035196.238806 1471203 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot
2024-04-02 05:19:56.239940: I tensorflow/core/data/service/server_lib.cc:94] Shut down WorkerServer server running at port 33565
I0000 00:00:1712035196.355159 1471207 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, created stream_0 and assigned to localhost:33565
2024-04-02 05:19:56.356785: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 39165
2024-04-02 05:19:56.357614: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp7a4erfm2/tf_data_dispatcher_journal
2024-04-02 05:19:56.357809: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 68us.
2024-04-02 05:19:56.371557: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, stream: 0, compression: SNAPPY }
2024-04-02 05:19:56.372127: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, stream 0, chunk 0.
I0000 00:00:1712035196.375814 1473409 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot
2024-04-02 05:19:56.376046: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config port: 39165 protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp7a4erfm2" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:19:56.376133: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:39165
2024-04-02 05:19:56.376150: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035196.377023 1473603 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__bfc15af137c92d8c_ldcg-aarch64-02-e0c1b8d4-1439200-6151641b90570.tfrecord*.
2024-04-02 05:19:56.377495: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:124] tf.data service snapshot writer is cancelled: CANCELLED: The tf.data service snapshot writer has been cancelled.
2024-04-02 05:19:56.536528: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, stream: 0, compression: SNAPPY }
2024-04-02 05:19:56.536593: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:39165. Worker config: port: 33565 protocol: "grpc" dispatcher_address: "localhost:39165" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:19:56.536776: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:33565
2024-04-02 05:19:56.537080: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, stream 0, chunk 0.
2024-04-02 05:19:57.376288: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035197.433133 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__728ac40485585f13_ldcg-aarch64-02-809b8c4b-1439200-6151641bb89c2.tfrecord*.
2024-04-02 05:19:58.376435: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035198.435503 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__728ac40485585f13_ldcg-aarch64-02-809b8c4b-1439200-6151641bb89c2.tfrecord*.
2024-04-02 05:19:59.385025: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035199.475806 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__728ac40485585f13_ldcg-aarch64-02-809b8c4b-1439200-6151641bb89c2.tfrecord*.
2024-04-02 05:20:00.385186: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035200.545803 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:01.385326: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035201.595795 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:02.385484: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035202.655864 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
I0000 00:00:1712035202.947629 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 1.
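(The "repetition N" / "Starting repetition_N" lines that follow track passes over a repeated source dataset; this test case writes 1000 elements repeated 10 times. A sketch of the kind of pipeline under test, continuing the earlier sketches and assuming the experimental distributed-save entry point exported as tf.data.experimental.distributed_save in recent TF releases; the snapshot path is illustrative:)

    # 1000 elements repeated 10 times, matching
    # ..._numelements_1000_numrepetitions_10_... in the test name. Each pass
    # over the source shows up as a "Starting repetition_N" line.
    dataset = tf.data.Dataset.range(1000).repeat(10)

    # Ask the tf.data service cluster to materialize the dataset as a
    # distributed snapshot; workers then stream chunks to disk, which is
    # what the "Writing TFRecord of 14B to file ..." lines record.
    tf.data.experimental.distributed_save(
        dataset,
        "/tmp/tf_data_snapshot",            # hypothetical snapshot dir
        dispatcher.target.split("://")[1],  # data service address
    )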
I0000 00:00:1712035202.951126 1508470 snapshot_manager.cc:775] Starting repetition_1 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
2024-04-02 05:20:03.385638: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035203.805821 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:04.385803: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035204.865895 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:05.385980: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035205.949303 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:06.386141: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035206.995870 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:07.386320: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035208.135007 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:08.386471: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035209.155818 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__728ac40485585f13_ldcg-aarch64-02-809b8c4b-1439200-6151641bb89c2.tfrecord*.
2024-04-02 05:20:09.386606: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035209.637123 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 2.
I0000 00:00:1712035209.640352 1549998 snapshot_manager.cc:775] Starting repetition_2 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
I0000 00:00:1712035210.276199 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__728ac40485585f13_ldcg-aarch64-02-809b8c4b-1439200-6151641bb89c2.tfrecord*.
2024-04-02 05:20:10.386747: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035211.335992 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__273bb2f5eafc29b6_ldcg-aarch64-02-d068f59-1439200-6151641bb89c7.tfrecord*.
2024-04-02 05:20:11.386905: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:12.387060: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035212.396053 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__840f4009c1c07ebb_ldcg-aarch64-02-d068f59-1439200-6151642a101d9.tfrecord*.
2024-04-02 05:20:13.387213: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035213.396400 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__6dcbf27e9fc0699f_ldcg-aarch64-02-809b8c4b-1439200-6151642a08fd1.tfrecord*.
I0000 00:00:1712035214.028419 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 3.
I0000 00:00:1712035214.067201 1572044 snapshot_manager.cc:775] Starting repetition_3 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
2024-04-02 05:20:14.387359: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035214.506233 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__840f4009c1c07ebb_ldcg-aarch64-02-d068f59-1439200-6151642a101d9.tfrecord*.
2024-04-02 05:20:15.387509: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035215.576050 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__6dcbf27e9fc0699f_ldcg-aarch64-02-809b8c4b-1439200-6151642a08fd1.tfrecord*.
2024-04-02 05:20:16.387657: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035216.647667 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__6dcbf27e9fc0699f_ldcg-aarch64-02-809b8c4b-1439200-6151642a08fd1.tfrecord*.
2024-04-02 05:20:17.387800: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035217.695723 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__840f4009c1c07ebb_ldcg-aarch64-02-d068f59-1439200-6151642a101d9.tfrecord*.
I0000 00:00:1712035217.949030 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 4.
I0000 00:00:1712035217.952103 1594271 snapshot_manager.cc:775] Starting repetition_4 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
2024-04-02 05:20:18.391679: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035218.825569 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__840f4009c1c07ebb_ldcg-aarch64-02-d068f59-1439200-6151642a101d9.tfrecord*.
2024-04-02 05:20:19.391828: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035219.917141 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__840f4009c1c07ebb_ldcg-aarch64-02-d068f59-1439200-6151642a101d9.tfrecord*.
2024-04-02 05:20:20.391995: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035221.005639 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__33a646335e6cdf38_ldcg-aarch64-02-809b8c4b-1439200-61516432701bb.tfrecord*.
2024-04-02 05:20:21.392158: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035221.436832 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 5.
I0000 00:00:1712035221.440148 1605656 snapshot_manager.cc:775] Starting repetition_5 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
I0000 00:00:1712035222.006254 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__52423fae53a7fde9_ldcg-aarch64-02-d068f59-1439200-6151643272ebc.tfrecord*.
2024-04-02 05:20:22.392316: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035223.006426 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__33a646335e6cdf38_ldcg-aarch64-02-809b8c4b-1439200-61516432701bb.tfrecord*.
2024-04-02 05:20:23.392472: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035224.016551 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__52423fae53a7fde9_ldcg-aarch64-02-d068f59-1439200-6151643272ebc.tfrecord*.
2024-04-02 05:20:24.392636: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035224.778139 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 6.
I0000 00:00:1712035224.781572 1616264 snapshot_manager.cc:775] Starting repetition_6 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
I0000 00:00:1712035225.196157 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__52423fae53a7fde9_ldcg-aarch64-02-d068f59-1439200-6151643272ebc.tfrecord*.
2024-04-02 05:20:25.392799: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035226.276027 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__33a646335e6cdf38_ldcg-aarch64-02-809b8c4b-1439200-61516432701bb.tfrecord*.
2024-04-02 05:20:26.392956: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035227.335990 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__33a646335e6cdf38_ldcg-aarch64-02-809b8c4b-1439200-61516432701bb.tfrecord*.
2024-04-02 05:20:27.393126: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:28.393278: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035228.405808 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__52423fae53a7fde9_ldcg-aarch64-02-d068f59-1439200-6151643272ebc.tfrecord*.
2024-04-02 05:20:29.393418: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035229.475844 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__33a646335e6cdf38_ldcg-aarch64-02-809b8c4b-1439200-61516432701bb.tfrecord*.
2024-04-02 05:20:30.393561: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035230.416088 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 7.
I0000 00:00:1712035230.419279 1641413 snapshot_manager.cc:775] Starting repetition_7 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
I0000 00:00:1712035230.626043 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__52423fae53a7fde9_ldcg-aarch64-02-d068f59-1439200-6151643272ebc.tfrecord*.
2024-04-02 05:20:31.393719: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035231.685995 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__df776f103621af71_ldcg-aarch64-02-809b8c4b-1439200-6151643bf94d3.tfrecord*.
2024-04-02 05:20:32.393885: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:32.605872: I tensorflow/core/data/service/dispatcher_impl.cc:1491] Lost worker localhost:33565 due to timeout
I0000 00:00:1712035232.755886 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__9a59e4b0a87599cf_ldcg-aarch64-02-d068f59-1439200-6151643c70eff.tfrecord*.
2024-04-02 05:20:33.615035: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035233.835866 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__df776f103621af71_ldcg-aarch64-02-809b8c4b-1439200-6151643bf94d3.tfrecord*.
2024-04-02 05:20:34.615226: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035234.895668 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__9a59e4b0a87599cf_ldcg-aarch64-02-d068f59-1439200-6151643c70eff.tfrecord*.
2024-04-02 05:20:35.615388: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035235.837535 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 8.
I0000 00:00:1712035235.840690 1662153 snapshot_manager.cc:775] Starting repetition_8 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
I0000 00:00:1712035236.046094 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__df776f103621af71_ldcg-aarch64-02-809b8c4b-1439200-6151643bf94d3.tfrecord*.
2024-04-02 05:20:36.615564: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035237.205277 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__df776f103621af71_ldcg-aarch64-02-809b8c4b-1439200-6151643bf94d3.tfrecord*.
2024-04-02 05:20:37.615723: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035238.256535 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__df776f103621af71_ldcg-aarch64-02-809b8c4b-1439200-6151643bf94d3.tfrecord*.
2024-04-02 05:20:38.615881: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:39.616030: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:45.097126: I tensorflow/core/data/service/dispatcher_impl.cc:1491] Lost worker localhost:33565 due to timeout
I0000 00:00:1712035245.123344 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__9a59e4b0a87599cf_ldcg-aarch64-02-d068f59-1439200-6151643c70eff.tfrecord*.
I0000 00:00:1712035245.636781 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 9.
I0000 00:00:1712035245.640132 1682808 snapshot_manager.cc:775] Starting repetition_9 for snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, source 0
2024-04-02 05:20:46.099794: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035246.123598 1474288 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__9a59e4b0a87599cf_ldcg-aarch64-02-d068f59-1439200-6151643c70eff.tfrecord*.
2024-04-02 05:20:47.099958: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035247.186041 1474287 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__999d9f9bf6a84fe5_ldcg-aarch64-02-809b8c4b-1439200-6151644b37435.tfrecord*.
I0000 00:00:1712035247.863395 1474289 snapshot_split_provider.cc:222] Reset tf.data snapshot split provider for snapshot base_path: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot" num_sources: 1 metadata { element_spec: "\212\002\004\022\000\030\t" compression: "SNAPPY" }, repetition 10.
2024-04-02 05:20:47.863919: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 10000, chunk size: 136.719KB.
2024-04-02 05:20:47.864391: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/checkpoints/checkpoint_10_10000. Checkpointing distributed tf.data snapshot writer took 426us
2024-04-02 05:20:47.865358: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot/streams/stream_0/checkpoints
2024-04-02 05:20:47.865696: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot, stream: 0, compression: SNAPPY }
I0000 00:00:1712035247.945020 1692785 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpj1wr_lgw/tmpqfonir3t/tf_data_snapshot
2024-04-02 05:20:48.105082: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:49.105236: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:50.105383: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:51.105531: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:20:51.709237: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-02 05:20:51.709781: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence
2024-04-02 05:20:51.716913: I tensorflow/core/data/service/server_lib.cc:94] Shut down WorkerServer server running at port 33565
2024-04-02 05:20:51.718767: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 39165
[       OK ] SnapshotFtTest.testRepeatedDatasetRecoversAndCompletes_test_mode_eager_tfapiversion_2_numelements_1000_numrepetitions_10_numworkers_1
[ RUN      ] SnapshotFtTest.testRepeatedDatasetRecoversAndCompletes_test_mode_graph_tfapiversion_1_numelements_1_numrepetitions_10_numworkers_3
[  SKIPPED ] SnapshotFtTest.testRepeatedDatasetRecoversAndCompletes_test_mode_graph_tfapiversion_1_numelements_1_numrepetitions_10_numworkers_3
[ RUN      ] SnapshotFtTest.testRepeatedDatasetRecoversAndCompletes_test_mode_graph_tfapiversion_2_numelements_2_numrepetitions_1_numworkers_1
2024-04-02 05:20:52.429211: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmplgf41c5f/tf_data_dispatcher_journal
2024-04-02 05:20:52.429282: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:20:52.429526: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmplgf41c5f" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:20:52.429561: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:36615
2024-04-02 05:20:52.429600: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:20:52.431793: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:36615. Worker config: protocol: "grpc" dispatcher_address: "localhost:36615" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:20:52.431998: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:37551
WARNING:tensorflow:From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/contextlib.py:105: TensorFlowTestCase.test_session (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating: Use `self.session()` or `self.cached_session()` instead.
W0402 05:20:52.442392 281473498444832 deprecation.py:50] From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/contextlib.py:105: TensorFlowTestCase.test_session (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating: Use `self.session()` or `self.cached_session()` instead.
2024-04-02 05:20:52.448462: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
I0000 00:00:1712035252.460398 1715708 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot
I0000 00:00:1712035252.473708 1715708 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot
2024-04-02 05:20:52.475049: I tensorflow/core/data/service/server_lib.cc:94] Shut down WorkerServer server running at port 37551
2024-04-02 05:20:52.476867: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 36615
2024-04-02 05:20:52.477582: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmplgf41c5f/tf_data_dispatcher_journal
2024-04-02 05:20:52.477770: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 62us.
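(The shutdown-then-"Restored from journal" sequence above is the fault-tolerance behavior under test: with fault_tolerant_mode on, the dispatcher journals its state to work_dir and replays it on restart. A minimal sketch of such a setup using the public tf.data service API follows; the path and variable names are illustrative, not the test's actual harness.)

    import tensorflow as tf

    work_dir = "/tmp/tf_data_dispatcher"  # placeholder; the test uses a Bazel tmp dir

    dispatcher = tf.data.experimental.service.DispatchServer(
        tf.data.experimental.service.DispatcherConfig(
            work_dir=work_dir,
            fault_tolerant_mode=True,  # journal state so a restart can recover it
        ))
    worker = tf.data.experimental.service.WorkerServer(
        tf.data.experimental.service.WorkerConfig(
            dispatcher_address=dispatcher.target.split("://")[1]))

    # Constructing a new DispatchServer over the same work_dir replays the journal,
    # which is what "Attempting to restore dispatcher state from journal" /
    # "Restored from journal in ..." correspond to in the log.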
I0000 00:00:1712035252.491700 1716073 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot
2024-04-02 05:20:52.491923: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config port: 36615 protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmplgf41c5f" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:20:52.492034: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:36615
2024-04-02 05:20:52.492060: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035252.495807 1716225 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot, created stream_0 and assigned to localhost:37551
2024-04-02 05:20:52.509925: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot, stream: 0, compression: SNAPPY }
2024-04-02 05:20:52.509982: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:36615. Worker config: port: 37551 protocol: "grpc" dispatcher_address: "localhost:36615" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:20:52.510176: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:37551
2024-04-02 05:20:52.510424: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot, stream 0, chunk 0.
I0000 00:00:1712035252.511435 1716398 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__1ac4f467ce14e9fc_ldcg-aarch64-02-2df815b1-1439200-6151645119fda.tfrecord*.
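(The stream/chunk records above come from writing a distributed snapshot through the service. Assuming the experimental distributed_save op available in recent TF releases, the write side of such a test reduces to roughly the sketch below; `dispatcher` is the server from the earlier sketch and the path is a placeholder.)

    snapshot_path = "/tmp/tf_data_snapshot"  # placeholder
    dataset = tf.data.Dataset.range(2)

    # Asks the tf.data service to materialize the dataset as a snapshot;
    # workers then emit the "Writing ... stream 0, chunk 0" records seen above.
    tf.data.experimental.distributed_save(
        dataset, snapshot_path, dispatcher.target, compression="SNAPPY")

    # Once the dispatcher logs "Finished writing tf.data distributed snapshot",
    # the committed chunks can be read back, e.g. with tf.data.Dataset.load(snapshot_path).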
2024-04-02 05:20:52.512270: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 2, chunk size: 28B.
2024-04-02 05:20:52.512699: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot/streams/stream_0/checkpoints/checkpoint_2_2. Checkpointing distributed tf.data snapshot writer took 398us
2024-04-02 05:20:52.513056: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot/streams/stream_0/checkpoints
2024-04-02 05:20:52.513318: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot, stream: 0, compression: SNAPPY }
I0000 00:00:1712035252.611077 1716071 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpounefyvr/tmprxxpg4bn/tf_data_snapshot
2024-04-02 05:20:52.815208: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence [[{{node IteratorGetNext}}]]
2024-04-02 05:20:52.817200: W tensorflow/core/framework/local_rendezvous.cc:404] Local rendezvous is aborting with status: OUT_OF_RANGE: End of sequence [[{{node IteratorGetNext}}]]
2024-04-02 05:20:52.818449: I tensorflow/core/data/service/server_lib.cc:94] Shut down WorkerServer server running at port 37551
2024-04-02 05:20:52.820313: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 36615
[       OK ] SnapshotFtTest.testRepeatedDatasetRecoversAndCompletes_test_mode_graph_tfapiversion_2_numelements_2_numrepetitions_1_numworkers_1
[ RUN      ] SnapshotFtTest.testSnapshotRecoveryFailsWithBadSourceName_test_mode_graph_tfapiversion_2_badsourcedirname_sourcex
2024-04-02 05:20:52.832294: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpt8wlvxud/tf_data_dispatcher_journal
2024-04-02 05:20:52.832373: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:20:52.832636: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpt8wlvxud" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:20:52.832669: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:40369
2024-04-02 05:20:52.832684: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
I0000 00:00:1712035252.852376 1717931 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp7yuctbj9/tmp8y56fx10/tf_data_snapshot
I0000 00:00:1712035252.866299 1717931 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp7yuctbj9/tmp8y56fx10/tf_data_snapshot
2024-04-02 05:20:52.875562: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 40369
2024-04-02 05:20:52.876335: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpt8wlvxud/tf_data_dispatcher_journal
2024-04-02 05:20:52.876533: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 40us.
2024-04-02 05:20:52.892347: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:20:52.892814: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 40369
[       OK ] SnapshotFtTest.testSnapshotRecoveryFailsWithBadSourceName_test_mode_graph_tfapiversion_2_badsourcedirname_sourcex
[ RUN      ] SnapshotFtTest.testSnapshotRecoveryFailsWithBadSplitNames_test_mode_graph_tfapiversion_2_badsplitfilename_split01
2024-04-02 05:20:52.898480: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpo4fywzuw/tf_data_dispatcher_journal
2024-04-02 05:20:52.898546: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:20:52.898839: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpo4fywzuw" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:20:52.898878: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:35575
2024-04-02 05:20:52.898895: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
I0000 00:00:1712035252.919109 1718362 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp8y7o1bj9/tmpmfhg4pw5/tf_data_snapshot
I0000 00:00:1712035252.932498 1718362 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp8y7o1bj9/tmpmfhg4pw5/tf_data_snapshot
2024-04-02 05:20:52.936433: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 35575
2024-04-02 05:20:52.937099: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpo4fywzuw/tf_data_dispatcher_journal
2024-04-02 05:20:52.937264: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 46us.
2024-04-02 05:20:52.950319: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:20:52.950718: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 35575
[       OK ] SnapshotFtTest.testSnapshotRecoveryFailsWithBadSplitNames_test_mode_graph_tfapiversion_2_badsplitfilename_split01
[ RUN      ] SnapshotFtTest.testSnapshotRecoveryFailsWithDuplicateGlobalIndexInSplitName_test_mode_eager_tfapiversion_2
2024-04-02 05:20:52.956055: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpax0bkubp/tf_data_dispatcher_journal
2024-04-02 05:20:52.956132: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:20:52.956439: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpax0bkubp" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 3
2024-04-02 05:20:52.956464: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:33225
2024-04-02 05:20:52.956514: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
I0000 00:00:1712035252.963061 1718590 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpqi9g0xa6/tmpxnf80qgx/tf_data_snapshot
I0000 00:00:1712035252.976738 1718590 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpqi9g0xa6/tmpxnf80qgx/tf_data_snapshot
2024-04-02 05:20:52.981139: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 33225
2024-04-02 05:20:52.981783: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpax0bkubp/tf_data_dispatcher_journal
2024-04-02 05:20:52.981939: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 36us.
2024-04-02 05:20:52.995174: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:20:52.995568: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 33225
[       OK ] SnapshotFtTest.testSnapshotRecoveryFailsWithDuplicateGlobalIndexInSplitName_test_mode_eager_tfapiversion_2
[ RUN      ] SnapshotFtTest.testSnapshotRecoveryFailsWithOutOfOrderSplitName_test_mode_graph_tfapiversion_1
[  SKIPPED ] SnapshotFtTest.testSnapshotRecoveryFailsWithOutOfOrderSplitName_test_mode_graph_tfapiversion_1
[ RUN      ] SnapshotFtTest.testWorkersDontExceedMaxStreamAssignments_test_mode_graph_tfapiversion_2_workermaxconcurrentsnapshots_2
2024-04-02 05:20:53.001764: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpsqklpfg0/tf_data_dispatcher_journal
2024-04-02 05:20:53.001844: I tensorflow/core/data/service/dispatcher_impl.cc:242] No journal found. Starting dispatcher from new state.
2024-04-02 05:20:53.002152: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpsqklpfg0" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 2
2024-04-02 05:20:53.002185: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:39261
2024-04-02 05:20:53.002198: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: INVALID_ARGUMENT: The current number of workers must be positive
2024-04-02 05:20:53.004282: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:39261. Worker config: protocol: "grpc" dispatcher_address: "localhost:39261" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:20:53.004476: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:41087
2024-04-02 05:20:53.006030: I tensorflow/core/data/service/worker_impl.cc:189] Worker registered with dispatcher running at localhost:39261. Worker config: protocol: "grpc" dispatcher_address: "localhost:39261" worker_address: "localhost:%port%" heartbeat_interval_ms: 100 dispatcher_timeout_ms: 5000 data_transfer_address: "localhost:%port%" snapshot_max_chunk_size_bytes: 16384
2024-04-02 05:20:53.006223: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data WorkerServer running at 0.0.0.0:42323
I0000 00:00:1712035253.026454 1718824 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0
I0000 00:00:1712035253.039819 1718824 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0
I0000 00:00:1712035253.059014 1718824 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1
I0000 00:00:1712035253.085912 1718824 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1
I0000 00:00:1712035253.105732 1718824 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0, created stream_0 and assigned to localhost:41087
I0000 00:00:1712035253.105911 1718821 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2
I0000 00:00:1712035253.119816 1718821 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2
I0000 00:00:1712035253.120885 1719196 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, created stream_0 and assigned to localhost:42323
2024-04-02 05:20:53.121174: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0, stream: 0, compression: SNAPPY }
2024-04-02 05:20:53.121763: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0, stream 0, chunk 0.
2024-04-02 05:20:53.137200: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, stream: 0, compression: SNAPPY }
2024-04-02 05:20:53.137860: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, stream 0, chunk 0.
I0000 00:00:1712035253.141455 1719636 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3
I0000 00:00:1712035253.155362 1719636 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3
I0000 00:00:1712035253.176329 1719819 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4
I0000 00:00:1712035253.189770 1719819 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4
I0000 00:00:1712035253.211248 1719964 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5
I0000 00:00:1712035253.226173 1719964 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5
I0000 00:00:1712035253.227175 1719977 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1, created stream_0 and assigned to localhost:41087
I0000 00:00:1712035253.238662 1720234 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, created stream_0 and assigned to localhost:42323
2024-04-02 05:20:53.241982: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1, stream: 0, compression: SNAPPY }
2024-04-02 05:20:53.242545: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1, stream 0, chunk 0.
I0000 00:00:1712035253.247727 1720354 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6
2024-04-02 05:20:53.254518: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, stream: 0, compression: SNAPPY }
2024-04-02 05:20:53.255233: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, stream 0, chunk 0.
I0000 00:00:1712035253.262392 1720354 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6
I0000 00:00:1712035253.284012 1720912 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7
I0000 00:00:1712035253.298212 1720912 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7
I0000 00:00:1712035253.320557 1721216 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8
I0000 00:00:1712035253.334766 1721216 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8
I0000 00:00:1712035253.357761 1721481 snapshot_manager.cc:181] Starting to write tf.data snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9
I0000 00:00:1712035253.372244 1721481 snapshot_manager.cc:192] Started writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9
I0000 00:00:1712035253.525991 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__b597d3216a16c160_ldcg-aarch64-02-b00149b2-1439200-61516451af3c8.tfrecord*.
2024-04-02 05:20:53.745449: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: CANCELLED: Failed to get snapshot split: tf.data prefetched split provider is shut down.. Will retry in 109ms.
2024-04-02 05:21:07.132084: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: CANCELLED: Failed to get snapshot split: tf.data prefetched split provider is shut down.. Will retry in 116ms.
I0000 00:00:1712035267.138868 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__b597d3216a16c160_ldcg-aarch64-02-b00149b2-1439200-61516451af3c8.tfrecord*.
2024-04-02 05:21:07.242891: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:124] tf.data service snapshot writer is cancelled: CANCELLED: Failed to get snapshot split: tf.data prefetched split provider is shut down.
2024-04-02 05:21:07.280246: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:124] tf.data service snapshot writer is cancelled: CANCELLED: Failed to get snapshot split: tf.data prefetched split provider is shut down.
2024-04-02 05:21:07.325617: W tensorflow/core/data/service/worker_impl.cc:590] Failed to send heartbeat to dispatcher: UNAVAILABLE: Failed to perform worker heartbeat: Socket closed
2024-04-02 05:21:07.335203: W tensorflow/core/data/service/worker_impl.cc:590] Failed to send heartbeat to dispatcher: UNAVAILABLE: Failed to perform worker heartbeat: Socket closed
2024-04-02 05:21:07.345315: I tensorflow/core/data/service/server_lib.cc:94] Shut down DispatchServer server running at port 39261
2024-04-02 05:21:07.346058: I tensorflow/core/data/service/dispatcher_impl.cc:235] Attempting to restore dispatcher state from journal in /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpsqklpfg0/tf_data_dispatcher_journal
2024-04-02 05:21:07.346285: I tensorflow/core/data/service/dispatcher_impl.cc:252] Restored from journal in 90us.
I0000 00:00:1712035267.491085 1725320 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9
I0000 00:00:1712035267.492487 1725327 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5
I0000 00:00:1712035267.504658 1725318 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3
I0000 00:00:1712035267.509714 1725333 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8
I0000 00:00:1712035267.529663 1725329 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4
I0000 00:00:1712035267.537249 1725334 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2
I0000 00:00:1712035267.569681 1725324 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0
2024-04-02 05:21:07.607757: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 111ms.
I0000 00:00:1712035267.641259 1725322 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7
I0000 00:00:1712035267.657557 1725331 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1
2024-04-02 05:21:07.712768: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 104ms.
2024-04-02 05:21:07.719910: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 125ms.
I0000 00:00:1712035267.725904 1725325 snapshot_manager.cc:271] Resumed writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6
2024-04-02 05:21:07.727342: I tensorflow/core/data/service/dispatcher_impl.cc:271] Started tf.data service dispatcher with config port: 39261 protocol: "grpc" work_dir: "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmpsqklpfg0" fault_tolerant_mode: true job_gc_check_interval_ms: 1000 job_gc_timeout_ms: 300000 client_timeout_ms: 300000 worker_timeout_ms: 200 worker_max_concurrent_snapshots: 2
2024-04-02 05:21:07.727433: I tensorflow/core/data/service/server_lib.cc:82] Started tf.data DispatchServer running at 0.0.0.0:39261
2024-04-02 05:21:07.729563: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:21:07.788365: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 111ms.
2024-04-02 05:21:07.818026: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 139ms.
2024-04-02 05:21:07.845755: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 183ms.
2024-04-02 05:21:07.875157: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 175ms.
2024-04-02 05:21:07.900335: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 118ms.
2024-04-02 05:21:07.958059: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 167ms.
2024-04-02 05:21:08.019624: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 174ms.
2024-04-02 05:21:08.029499: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 164ms.
2024-04-02 05:21:08.050596: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 199ms.
2024-04-02 05:21:08.126366: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 226ms.
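(The "Will retry in Nms" lines show snapshot clients polling the dispatcher with short, jittered, growing delays until the restarted server accepts connections again. The real retry loop lives in C++ grpc_util; the sketch below is an illustrative Python rendering of that pattern, not TensorFlow's actual implementation.)

    import random
    import time

    def retry_transient(fn, is_transient, deadline_s=60.0):
        # Illustrative backoff loop mirroring the jittered, growing delays in
        # the "Will retry in ...ms" messages above.
        delay_s = 0.1
        deadline = time.time() + deadline_s
        while True:
            try:
                return fn()
            except Exception as err:  # TF checks gRPC codes such as UNAVAILABLE
                if not is_transient(err) or time.time() >= deadline:
                    raise
                time.sleep(delay_s * random.uniform(0.7, 1.3))  # jitter
                delay_s = min(delay_s * 1.3, 1.0)  # grow toward a cap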
2024-04-02 05:21:08.195178: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 280ms.
2024-04-02 05:21:08.195308: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 201ms.
2024-04-02 05:21:08.250634: I tensorflow/core/data/service/grpc_util.cc:84] Failed to Get next split for snapshot: UNAVAILABLE: Failed to get snapshot split: failed to connect to all addresses. Will retry in 215ms.
I0000 00:00:1712035268.416265 1720324 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__978538cec0241508_ldcg-aarch64-02-372379d5-1439200-61516451ccbbf.tfrecord*.
2024-04-02 05:21:08.735046: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035269.416595 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__b597d3216a16c160_ldcg-aarch64-02-b00149b2-1439200-61516451af3c8.tfrecord*.
2024-04-02 05:21:09.735320: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035270.416834 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__b597d3216a16c160_ldcg-aarch64-02-b00149b2-1439200-61516451af3c8.tfrecord*.
2024-04-02 05:21:10.735475: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035271.425695 1720327 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__d1301aa2f3db7ad6_ldcg-aarch64-02-a13d9727-1439200-615164627190d.tfrecord*.
2024-04-02 05:21:11.735646: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035272.425839 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__61056381448a4661_ldcg-aarch64-02-b00149b2-1439200-61516462b946f.tfrecord*.
2024-04-02 05:21:12.735799: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035273.426457 1720327 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__d1301aa2f3db7ad6_ldcg-aarch64-02-a13d9727-1439200-615164627190d.tfrecord*.
2024-04-02 05:21:13.735940: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035274.426813 1720327 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__d1301aa2f3db7ad6_ldcg-aarch64-02-a13d9727-1439200-615164627190d.tfrecord*.
2024-04-02 05:21:14.736095: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035275.426945 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__61056381448a4661_ldcg-aarch64-02-b00149b2-1439200-61516462b946f.tfrecord*.
2024-04-02 05:21:15.736234: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035276.427206 1719466 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__61056381448a4661_ldcg-aarch64-02-b00149b2-1439200-61516462b946f.tfrecord*.
2024-04-02 05:21:16.736392: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035277.455159 1719469 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__18c83413c7d57085_ldcg-aarch64-02-a02b1b43-1439200-61516468cdf74.tfrecord*.
2024-04-02 05:21:17.596099: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:17.596605: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 452us
2024-04-02 05:21:17.597416: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1/streams/stream_0/checkpoints
2024-04-02 05:21:17.597829: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1, stream: 0, compression: SNAPPY }
I0000 00:00:1712035277.666545 1760337 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_1
2024-04-02 05:21:17.736545: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035277.767917 1761187 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7, created stream_0 and assigned to localhost:41087
2024-04-02 05:21:17.781627: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7, stream: 0, compression: SNAPPY }
2024-04-02 05:21:17.782165: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7, stream 0, chunk 0.
2024-04-02 05:21:18.141351: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:18.141834: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 429us
2024-04-02 05:21:18.142591: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0/streams/stream_0/checkpoints
2024-04-02 05:21:18.142867: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0, stream: 0, compression: SNAPPY }
I0000 00:00:1712035278.187421 1761891 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_0
I0000 00:00:1712035278.296717 1762129 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6, created stream_0 and assigned to localhost:41087
2024-04-02 05:21:18.335262: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6, stream: 0, compression: SNAPPY }
2024-04-02 05:21:18.335860: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6, stream 0, chunk 0.
I0000 00:00:1712035278.455302 1762408 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__ed40ed51ebd00f08_ldcg-aarch64-02-72f0d0b8-1439200-61516469bb04b.tfrecord*.
2024-04-02 05:21:18.736708: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035279.455766 1762410 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__969023e7014d2d80_ldcg-aarch64-02-a02b1b43-1439200-61516469bb04f.tfrecord*.
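
[Editor's note: the records above trace one full snapshot stream lifecycle: snapshot_manager assigns stream_0 of a snapshot to a worker, the stream writer appends TFRecord shards under streams/stream_0/uncommitted_chunks/, periodically checkpoints its progress, and on completion deletes the checkpoints directory and reports first the stream and then the snapshot as finished. Below is a minimal sketch of the client side that triggers this flow, assuming a TF build (>= 2.13) that ships tf.data.experimental.distributed_save; the paths are placeholders rather than the Bazel-managed temp directories in the log, and the exact address form accepted by distributed_save (with or without the grpc:// scheme) may vary across versions.

import tensorflow as tf

# Start an in-process dispatcher with a work_dir so its journal survives
# restarts, plus one worker that registers with it.
dispatcher = tf.data.experimental.service.DispatchServer(
    tf.data.experimental.service.DispatcherConfig(
        work_dir="/tmp/tf_data_dispatcher",  # placeholder path
        fault_tolerant_mode=True))
worker = tf.data.experimental.service.WorkerServer(
    tf.data.experimental.service.WorkerConfig(
        dispatcher_address=dispatcher.target.split("://")[1]))

# Ask the service to materialize the dataset as a distributed snapshot; the
# dispatcher then assigns streams to workers, as seen in the snapshot_manager
# records above. The write itself proceeds in the background on the workers.
dataset = tf.data.Dataset.range(5000)
tf.data.experimental.distributed_save(
    dataset, "/tmp/tf_data_snapshot", dispatcher.target)
]
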
2024-04-02 05:21:19.745037: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035280.456046 1762408 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__ed40ed51ebd00f08_ldcg-aarch64-02-72f0d0b8-1439200-61516469bb04b.tfrecord*.
2024-04-02 05:21:20.745412: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035281.468963 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__64c81c3bb54d00ec_ldcg-aarch64-02-5e4a1e71-1439200-6151646933d99.tfrecord*.
2024-04-02 05:21:21.745574: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035282.485745 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__64c81c3bb54d00ec_ldcg-aarch64-02-5e4a1e71-1439200-6151646933d99.tfrecord*.
2024-04-02 05:21:22.745720: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035283.485937 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__64c81c3bb54d00ec_ldcg-aarch64-02-5e4a1e71-1439200-6151646933d99.tfrecord*.
2024-04-02 05:21:23.745846: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035284.486083 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a487ea8e08d6cd09_ldcg-aarch64-02-5e4a1e71-1439200-6151646f32a91.tfrecord*.
2024-04-02 05:21:24.745972: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035285.495568 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a487ea8e08d6cd09_ldcg-aarch64-02-5e4a1e71-1439200-6151646f32a91.tfrecord*.
2024-04-02 05:21:25.746110: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035286.496008 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a487ea8e08d6cd09_ldcg-aarch64-02-5e4a1e71-1439200-6151646f32a91.tfrecord*.
2024-04-02 05:21:26.746267: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035287.505618 1761473 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__e2279f953589c54f_ldcg-aarch64-02-e317456a-1439200-6151646f2d10c.tfrecord*.
2024-04-02 05:21:27.746418: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035288.506037 1762410 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__5d7e187832e772ec_ldcg-aarch64-02-a02b1b43-1439200-6151646fee1fa.tfrecord*.
2024-04-02 05:21:28.755018: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035289.517500 1762408 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__cc71ba49ddd02922_ldcg-aarch64-02-72f0d0b8-1439200-61516470327bc.tfrecord*.
2024-04-02 05:21:29.755468: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035290.535862 1761472 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a487ea8e08d6cd09_ldcg-aarch64-02-5e4a1e71-1439200-6151646f32a91.tfrecord*.
2024-04-02 05:21:30.755613: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035291.536229 1761473 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__e2279f953589c54f_ldcg-aarch64-02-e317456a-1439200-6151646f2d10c.tfrecord*.
2024-04-02 05:21:31.755760: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035292.545675 1762408 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__cc71ba49ddd02922_ldcg-aarch64-02-72f0d0b8-1439200-61516470327bc.tfrecord*.
2024-04-02 05:21:32.755903: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035293.575682 1761473 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__e2279f953589c54f_ldcg-aarch64-02-e317456a-1439200-6151646f2d10c.tfrecord*.
2024-04-02 05:21:33.756052: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035294.575856 1762410 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__5d7e187832e772ec_ldcg-aarch64-02-a02b1b43-1439200-6151646fee1fa.tfrecord*.
2024-04-02 05:21:34.756201: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035295.585672 1762408 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__cc71ba49ddd02922_ldcg-aarch64-02-72f0d0b8-1439200-61516470327bc.tfrecord*.
2024-04-02 05:21:35.756356: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:21:35.848005: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:35.848470: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 411us
2024-04-02 05:21:35.849157: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7/streams/stream_0/checkpoints
2024-04-02 05:21:35.849424: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7, stream: 0, compression: SNAPPY }
I0000 00:00:1712035295.919579 1797394 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_7
I0000 00:00:1712035296.020712 1797514 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9, created stream_0 and assigned to localhost:41087
2024-04-02 05:21:36.034512: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9, stream: 0, compression: SNAPPY }
2024-04-02 05:21:36.035068: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9, stream 0, chunk 0.
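
[Editor's note: the AutoScaler warning that recurs roughly once per second throughout this log is emitted by the dispatcher's periodic metric update and is expected here. The optimal-worker-count estimate needs processing and target processing times, which are only reported when a client iterates over a data-service dataset; a snapshot-only workload never registers such an iteration, so the metric update stays UNAVAILABLE for the whole test. A sketch of the kind of client that would feed the AutoScaler, using the standard tf.data service distribute API; the address is a placeholder standing in for the dispatcher's randomly chosen port.

import tensorflow as tf

service = "grpc://localhost:32853"  # placeholder dispatcher address

# Iterating a distributed dataset makes the client report processing times to
# the dispatcher, which is exactly what the AutoScaler updates above lack.
ds = tf.data.Dataset.range(10).apply(
    tf.data.experimental.service.distribute(
        processing_mode=tf.data.experimental.service.ShardingPolicy.OFF,
        service=service))
for _ in ds:
    pass
]
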
I0000 00:00:1712035296.607766 1762408 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__42d19441f7be99f1_ldcg-aarch64-02-72f0d0b8-1439200-6151647a4c146.tfrecord*.
2024-04-02 05:21:36.609852: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:36.610323: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 436us
2024-04-02 05:21:36.611232: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6/streams/stream_0/checkpoints
2024-04-02 05:21:36.611525: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6, stream: 0, compression: SNAPPY }
I0000 00:00:1712035296.640983 1798392 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_6
I0000 00:00:1712035296.742540 1798830 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5, created stream_0 and assigned to localhost:41087
2024-04-02 05:21:36.755847: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5, stream: 0, compression: SNAPPY }
2024-04-02 05:21:36.756355: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5, stream 0, chunk 0.
2024-04-02 05:21:36.756502: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035297.607937 1797707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__d8957af708f3fdd4_ldcg-aarch64-02-b9ba6d4d-1439200-6151647a9c20a.tfrecord*.
2024-04-02 05:21:37.756655: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035298.608180 1799187 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__4814b7122e16232c_ldcg-aarch64-02-a02b1b43-1439200-6151647b4c352.tfrecord*.
2024-04-02 05:21:38.756828: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035299.608857 1797707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__d8957af708f3fdd4_ldcg-aarch64-02-b9ba6d4d-1439200-6151647a9c20a.tfrecord*.
2024-04-02 05:21:39.756977: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035300.655743 1797707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__d8957af708f3fdd4_ldcg-aarch64-02-b9ba6d4d-1439200-6151647a9c20a.tfrecord*.
2024-04-02 05:21:40.757123: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035301.656835 1797707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a69def8d6945f618_ldcg-aarch64-02-b9ba6d4d-1439200-6151647fb533e.tfrecord*.
2024-04-02 05:21:41.757280: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035302.666103 1797706 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a2c54c807de56ffa_ldcg-aarch64-02-153e6f0d-1439200-6151647fb54ab.tfrecord*.
2024-04-02 05:21:42.765079: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035303.669869 1797707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a69def8d6945f618_ldcg-aarch64-02-b9ba6d4d-1439200-6151647fb533e.tfrecord*.
2024-04-02 05:21:43.765235: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:21:44.529036: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:44.529587: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 496us
2024-04-02 05:21:44.530458: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9/streams/stream_0/checkpoints
2024-04-02 05:21:44.530799: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9, stream: 0, compression: SNAPPY }
I0000 00:00:1712035304.597372 1817752 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_9
I0000 00:00:1712035304.673211 1799187 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__ddbac9d284ce1517_ldcg-aarch64-02-a02b1b43-1439200-61516482a8e7e.tfrecord*.
2024-04-02 05:21:44.694941: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
I0000 00:00:1712035304.698858 1817938 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4, created stream_0 and assigned to localhost:41087
2024-04-02 05:21:44.705524: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 10.496ms
2024-04-02 05:21:44.706477: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5/streams/stream_0/checkpoints
2024-04-02 05:21:44.706869: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5, stream: 0, compression: SNAPPY }
2024-04-02 05:21:44.720128: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4, stream: 0, compression: SNAPPY }
2024-04-02 05:21:44.720740: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4, stream 0, chunk 0.
2024-04-02 05:21:44.778670: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035304.835621 1817938 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_5
I0000 00:00:1712035304.937724 1817938 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8, created stream_0 and assigned to localhost:41087
2024-04-02 05:21:44.960827: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8, stream: 0, compression: SNAPPY }
2024-04-02 05:21:44.961371: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8, stream 0, chunk 0.
I0000 00:00:1712035305.696845 1818975 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__19c471d99f1b1104_ldcg-aarch64-02-5e4a1e71-1439200-615164831f652.tfrecord*.
2024-04-02 05:21:45.785065: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035306.697023 1818313 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__4329ee9cf941e379_ldcg-aarch64-02-6997a738-1439200-61516482e4a43.tfrecord*.
2024-04-02 05:21:46.795135: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035307.697131 1818313 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__4329ee9cf941e379_ldcg-aarch64-02-6997a738-1439200-61516482e4a43.tfrecord*.
2024-04-02 05:21:47.805051: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035308.705925 1818313 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__678597eb416dae40_ldcg-aarch64-02-6997a738-1439200-61516485f6ca9.tfrecord*.
2024-04-02 05:21:48.805228: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035309.719371 1818974 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__1e791464efafff_ldcg-aarch64-02-896d3507-1439200-6151648647e17.tfrecord*.
2024-04-02 05:21:49.815058: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035310.726944 1818313 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__678597eb416dae40_ldcg-aarch64-02-6997a738-1439200-61516485f6ca9.tfrecord*.
2024-04-02 05:21:50.815230: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035311.727174 1818312 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__1c2d3cb7d7c38331_ldcg-aarch64-02-ef50c3af-1439200-6151648948644.tfrecord*.
2024-04-02 05:21:51.784142: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:51.784608: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 406us
2024-04-02 05:21:51.785356: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4/streams/stream_0/checkpoints
2024-04-02 05:21:51.785676: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4, stream: 0, compression: SNAPPY }
2024-04-02 05:21:51.815394: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035311.850354 1845272 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_4
I0000 00:00:1712035311.951682 1846013 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, created stream_1 and assigned to localhost:41087
2024-04-02 05:21:51.966063: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, stream: 1, compression: SNAPPY }
2024-04-02 05:21:51.966636: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, stream 1, chunk 0.
2024-04-02 05:21:51.988173: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8, stream: 0, compression: SNAPPY }. Stream 0, chunk 0, number of elements in chunk: 5000, chunk size: 68.3594KB.
2024-04-02 05:21:51.988710: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8/streams/stream_0/checkpoints/checkpoint_6_5000. Checkpointing distributed tf.data snapshot writer took 481us
2024-04-02 05:21:51.989549: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8/streams/stream_0/checkpoints
2024-04-02 05:21:51.989891: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8, stream: 0, compression: SNAPPY }
I0000 00:00:1712035312.067122 1846754 snapshot_manager.cc:543] Finished writing tf.data distributed snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_8
I0000 00:00:1712035312.168694 1846745 snapshot_manager.cc:687] For snapshot at /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, created stream_1 and assigned to localhost:41087
2024-04-02 05:21:52.182804: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:120] Writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, stream: 1, compression: SNAPPY }
2024-04-02 05:21:52.183458: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:172] Writing distributed tf.data snapshot /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, stream 1, chunk 0.
I0000 00:00:1712035312.735056 1846707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a96bc436fbe20d5_ldcg-aarch64-02-7f675047-1439200-61516489cda83.tfrecord*.
2024-04-02 05:21:52.825051: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035313.745931 1846708 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__8659da58fff8886c_ldcg-aarch64-02-c4098d5b-1439200-61516489cdcbc.tfrecord*.
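
[Editor's note: once snapshot_manager.cc logs "Finished writing tf.data distributed snapshot", the chunks committed out of uncommitted_chunks/ form a complete, loadable dataset. A sketch of reading one back, assuming a TF build where tf.data.Dataset.load understands the distributed snapshot layout and recovers the element_spec from the snapshot's own metadata; the path is a placeholder for one of the tf_data_snapshot_N directories above.

import tensorflow as tf

# Load the materialized snapshot; no element_spec argument is needed when the
# spec can be read back from the snapshot metadata.
ds = tf.data.Dataset.load("/tmp/tf_data_snapshot")
for element in ds.take(3):
    print(element.numpy())
]
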
2024-04-02 05:21:53.835051: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035314.757909 1848242 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__cd8444816d888807_ldcg-aarch64-02-896d3507-1439200-6151648a0295f.tfrecord*.
2024-04-02 05:21:54.835214: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035315.758030 1846708 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a0c4cc36515bb028_ldcg-aarch64-02-c4098d5b-1439200-6151648c50d07.tfrecord*.
2024-04-02 05:21:55.835368: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035316.774452 1846707 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__75a7dfd91b16956d_ldcg-aarch64-02-7f675047-1439200-6151648c54847.tfrecord*.
2024-04-02 05:21:56.835526: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
I0000 00:00:1712035317.774557 1846708 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__a0c4cc36515bb028_ldcg-aarch64-02-c4098d5b-1439200-6151648c50d07.tfrecord*.
2024-04-02 05:21:57.835686: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:21:58.604363: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, stream: 1, compression: SNAPPY }. Stream 1, chunk 0, number of elements in chunk: 4785, chunk size: 65.4199KB.
2024-04-02 05:21:58.604889: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/checkpoints/checkpoint_6_4785. Checkpointing distributed tf.data snapshot writer took 467us
2024-04-02 05:21:58.605668: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3/streams/stream_1/checkpoints
2024-04-02 05:21:58.606010: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3, stream: 1, compression: SNAPPY }
I0000 00:00:1712035318.774799 1848242 parallel_tfrecord_writer.cc:167] Writing TFRecord of 14B to file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2/streams/stream_1/uncommitted_chunks/chunk_0_CHUNK_SHARDS___shard__4a879df2bce648c1_ldcg-aarch64-02-896d3507-1439200-6151648cf5d14.tfrecord*.
2024-04-02 05:21:58.845119: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
2024-04-02 05:21:58.931232: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:288] Checkpointing distributed tf.data snapshot writer for snapshot SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, stream: 1, compression: SNAPPY }. Stream 1, chunk 0, number of elements in chunk: 4847, chunk size: 66.2676KB.
2024-04-02 05:21:58.931751: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:306] Wrote checkpoint file /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2/streams/stream_1/checkpoints/checkpoint_6_4847. Checkpointing distributed tf.data snapshot writer took 459us 2024-04-02 05:21:58.932501: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:343] Deleting tf.data snapshot checkpoints directory: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2/streams/stream_1/checkpoints 2024-04-02 05:21:58.932824: I tensorflow/core/data/service/snapshot/snapshot_stream_writer.cc:135] Finished writing distributed tf.data snapshot stream: SnapshotWriterParams { base_path: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2, stream: 1, compression: SNAPPY } 2024-04-02 05:21:59.854698: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:00.855056: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:01.855237: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:02.855407: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:03.856472: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:04.856626: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:05.865067: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:22:06.865235: W 
I0000 00:00:1712035327.353036 1904165 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035327.353125 1904165 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
[AutoScaler warning repeated ~once per second, 05:22:07 through 05:23:07]
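The once-per-second AutoScaler warning is emitted by the dispatcher's metrics loop, and UNAVAILABLE here only means that no client has reported iterator timing yet: the optimal-worker estimate needs per-iteration processing and target processing times, which are reported only when a consumer actually reads through the service. A hedged sketch of the kind of read that would feed that metric, reusing the hypothetical dispatcher from the sketch above; nothing in this test necessarily does this:

    import tensorflow as tf

    def read_through_service(dataset, service_target):
        # Routing the dataset through the tf.data service makes the client
        # iterator report processing/target processing times in its
        # heartbeats, which is what the AutoScaler metric is waiting for.
        ds = dataset.apply(
            tf.data.experimental.service.distribute(
                processing_mode=tf.data.experimental.service.ShardingPolicy.OFF,
                service=service_target))
        for _ in ds:
            pass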
I0000 00:00:1712035387.405290 2300134 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035387.406032 2312872 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
[AutoScaler warning repeated ~once per second, 05:23:08 through 05:24:07]
I0000 00:00:1712035447.465812 2709300 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035447.466385 2660049 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
[AutoScaler warning repeated ~once per second, 05:24:08 through 05:25:06]
optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:25:03.733548: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:25:04.735051: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:25:05.747502: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:25:06.747662: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration I0000 00:00:1712035507.515516 2921415 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed. I0000 00:00:1712035507.515826 2921964 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed. 
2024-04-02 05:25:07.755106: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[previous warning repeated approximately once per second through 2024-04-02 05:26:07.055071]
I0000 00:00:1712035567.595498 3143825 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035567.595581 3143825 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
2024-04-02 05:26:08.075046: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[previous warning repeated approximately once per second through 2024-04-02 05:27:07.545085]
I0000 00:00:1712035627.596610 3280734 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035627.596877 3305108 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
2024-04-02 05:27:08.565035: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:09.585032: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:10.595221: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:11.605061: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:12.615037: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:13.645062: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:14.715028: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:15.725054: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:16.735057: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:17.745060: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:18.763328: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target 
processing times for at least one iteration 2024-04-02 05:27:19.775067: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:20.785068: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:21.805034: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:22.815043: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:23.825049: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:24.825306: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:25.835048: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:26.835215: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:27.845047: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:28.846097: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:27:29.846275: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because 
there are no reported processing and target processing times for at least one iteration
2024-04-02 05:27:30.847516: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[the warning above repeats roughly once per second through 2024-04-02 05:28:07.101090]
I0000 00:00:1712035687.646554 3519232 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035687.646857 3520159 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
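For context: the AutoScaler warning above is emitted while no client has yet iterated a data-service dataset, so there are no processing/target processing times for the dispatcher to aggregate. Below is a minimal sketch of the setup involved, assuming the public tf.data service Python API (port and dataset are illustrative; the test wires its cluster up through its own utilities):

import tensorflow as tf

# Bring up a one-worker tf.data service cluster in-process.
dispatcher = tf.data.experimental.service.DispatchServer(
    tf.data.experimental.service.DispatcherConfig(port=0))
worker = tf.data.experimental.service.WorkerServer(
    tf.data.experimental.service.WorkerConfig(
        dispatcher_address=dispatcher.target.split("://")[1]))

dataset = tf.data.Dataset.range(10)
dataset = dataset.apply(
    tf.data.experimental.service.distribute(
        processing_mode=tf.data.experimental.service.ShardingPolicy.OFF,
        service=dispatcher.target))

# Only once a client iterates does it report processing/target processing
# times back to the dispatcher, so until this loop runs the UNAVAILABLE
# warning above is expected and harmless.
for _ in dataset:
    pass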
2024-04-02 05:28:08.101283: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[the warning above repeats roughly once per second through 2024-04-02 05:29:07.645075]
I0000 00:00:1712035747.705405 3712214 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035747.705486 3712214 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
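The snapshot-progress lines track distributed snapshot streams being written by workers. A rough sketch of the kind of call that starts such a snapshot, assuming the experimental distributed_save entry point in tensorflow/python/data/experimental/ops/distributed_save_op.py (its exact location and signature have varied across TF versions; the dataset, directory, and address below are hypothetical):

import tensorflow as tf
from tensorflow.python.data.experimental.ops import distributed_save_op

dataset = tf.data.Dataset.range(5000)  # illustrative source dataset

# distributed_save returns once the snapshot is registered with the
# dispatcher; workers then write stream/chunk files in the background,
# and the dispatcher periodically logs progress lines like the ones
# above ("N/M streams completed; X/Y splits assigned or completed").
distributed_save_op.distributed_save(
    dataset,
    "/tmp/tf_data_snapshot",  # hypothetical snapshot directory
    "localhost:5000")         # hypothetical dispatcher address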
2024-04-02 05:29:08.665026: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[the warning above repeats roughly once per second through 2024-04-02 05:30:07.065255]
I0000 00:00:1712035807.715992 3915716 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035807.780454 3956439 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
2024-04-02 05:30:08.075026: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[the warning above repeats roughly once per second through 2024-04-02 05:30:12.085040]
2024-04-02 05:30:12.085115: I tensorflow/core/data/service/dispatcher_impl.cc:1491] Lost worker localhost:42323 due to timeout
2024-04-02 05:30:13.085257: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration
[the warning above resumes, roughly once per second, through 2024-04-02 05:31:07.663766]
I0000 00:00:1712035867.776368 4175479 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035867.885503 4175479 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
[AutoScaler warning repeated approximately once per second; duplicate entries from 2024-04-02 05:31:08 through 05:32:07 elided.]
I0000 00:00:1712035927.861170 181327 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035927.964402 181327 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
[AutoScaler warning repeated approximately once per second; duplicate entries from 2024-04-02 05:32:08 through 05:33:07 elided.]
I0000 00:00:1712035987.869265 324556 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed.
I0000 00:00:1712035987.972744 324556 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed.
[AutoScaler warning repeated approximately once per second; duplicate entries from 2024-04-02 05:33:08 through 05:34:02 elided.]
2024-04-02 05:34:03.038644: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the
optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:04.041895: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:05.042056: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:06.045059: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:07.045226: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration I0000 00:00:1712036047.875587 493489 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_3]: 1/2 streams completed; 5000/5000 splits assigned or completed. I0000 00:00:1712036047.995667 493489 snapshot_manager.cc:648] tf.data snapshot progress [/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/_tmp/668832b5c288693bc5b6071fe73c176472howyvt/tmp3fxna93s/tmph18_dbf_/tf_data_snapshot_2]: 1/2 streams completed; 5000/5000 splits assigned or completed. 
2024-04-02 05:34:08.065045: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:09.065213: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:10.075045: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:11.075464: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:12.105026: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:13.125084: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:14.135049: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:15.145048: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:16.155044: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:17.165045: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:18.185038: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target 
processing times for at least one iteration 2024-04-02 05:34:19.205045: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:20.215054: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:21.219352: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:22.219528: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:23.221137: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:24.231730: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:25.240580: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:26.240818: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:27.240996: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:28.242036: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:29.245043: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because 
there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:30.265036: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:31.265246: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:32.395378: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:32.395438: I tensorflow/core/data/service/dispatcher_impl.cc:1491] Lost worker localhost:42323 due to timeout 2024-04-02 05:34:33.405054: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:34.415067: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:35.415282: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:36.435035: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:37.445044: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:38.445224: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:39.455043: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:40.465049: W 
tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:41.475121: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:42.485071: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:43.494877: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:44.495042: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:45.665046: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:46.669975: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:47.670620: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:48.670784: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration 2024-04-02 05:34:49.670958: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported processing and target processing times for at least one iteration -- Test timed out at 2024-04-02 05:34:50 UTC -- 2024-04-02 05:34:50.671136: W tensorflow/core/data/service/dispatcher_impl.cc:1403] Error updating the optimal number of workers metric in tf.data service AutoScaler: UNAVAILABLE: Cannot update the optimal number of workers metric because there are no reported 
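For context on the dump that follows, a hedged sketch (not part of the original log): "Current thread ... (most recent call first)" blocks in this format are what Python's faulthandler module prints when a timeout watchdog fires, and absltest enables faulthandler for its test processes. A minimal standalone reproduction of the format, where slow_poll is a hypothetical stand-in for the test's polling loop:

```python
import faulthandler
import time

# Dump every thread's stack, in the same "most recent call first"
# format seen below, if we are still running after 2 seconds;
# exit=False lets the process continue after the dump.
faulthandler.dump_traceback_later(2, exit=False)

def slow_poll():
    # Hypothetical stand-in for a wait_for_snapshot-style busy wait.
    time.sleep(5)

slow_poll()
faulthandler.cancel_dump_traceback_later()
```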
Current thread 0x0000ffffa7e37420 (most recent call first):
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.py", line 92 in wait_for_snapshot
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.py", line 323 in testWorkersDontExceedMaxStreamAssignments
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/framework/test_combinations.py", line 343 in execute_test_method
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/framework/test_combinations.py", line 360 in decorated
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/testing/parameterized.py", line 314 in bound_param_test
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/case.py", line 579 in _callTestMethod
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/case.py", line 623 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/case.py", line 678 in __call__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 122 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 84 in __call__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 122 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 84 in __call__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/runner.py", line 217 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/main.py", line 274 in runTests
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/main.py", line 102 in __init__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/testing/absltest.py", line 2537 in _run_and_get_tests_result
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/testing/absltest.py", line 2568 in run_tests
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/testing/absltest.py", line 2156 in _run_in_app
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/testing/absltest.py", line 2049 in main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/platform/googletest.py", line 51 in g_main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/app.py", line 258 in _run_main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/absl_py/absl/app.py", line 312 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/platform/googletest.py", line 60 in main_wrapper
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/platform/benchmark.py", line 489 in benchmarks_main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/platform/googletest.py", line 62 in main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/platform/test.py", line 53 in main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.runfiles/org_tensorflow/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test.py", line 534 in <module>
================================================================================
==================== Test output for //tensorflow/python/kernel_tests/linalg:matrix_triangular_solve_op_test_cpu (shard 1 of 3):
2024-04-02 05:26:13.386586: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
Running tests under Python 3.11.6: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/python_aarch64-unknown-linux-gnu/bin/python3
[ RUN ] MatrixTriangularSolveOpTest.testEmpty
INFO:tensorflow:time(__main__.MatrixTriangularSolveOpTest.testEmpty): 0.16s
I0402 05:26:16.072920 281473424323616 test_util.py:2634] time(__main__.MatrixTriangularSolveOpTest.testEmpty): 0.16s
[ OK ] MatrixTriangularSolveOpTest.testEmpty
[ RUN ] MatrixTriangularSolveOpTest.testSolve
WARNING:tensorflow:From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/contextlib.py:105: TensorFlowTestCase.test_session (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `self.session()` or `self.cached_session()` instead.
W0402 05:26:16.080428 281473424323616 deprecation.py:50] From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/contextlib.py:105: TensorFlowTestCase.test_session (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `self.session()` or `self.cached_session()` instead.
2024-04-02 05:26:16.089479: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
INFO:tensorflow:time(__main__.MatrixTriangularSolveOpTest.testSolve): 1.6s
I0402 05:26:17.676262 281473424323616 test_util.py:2634] time(__main__.MatrixTriangularSolveOpTest.testSolve): 1.6s
[ OK ] MatrixTriangularSolveOpTest.testSolve
[ RUN ] MatrixTriangularSolveOpTest.testSolveBatchBroadcastLargerBatches
-- Test timed out at 2024-04-02 05:41:11 UTC --
Current thread 0x0000ffffa3787420 (most recent call first):
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/pypi_numpy/site-packages/numpy/linalg/linalg.py", line 400 in solve
  File "<__array_function__ internals>", line 180 in solve
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test.py", line 89 in _verifySolve
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test.py", line 31 in _verifySolveAllWays
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test.py", line 41 in _verifySolveAllWaysReal
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test.py", line 173 in testSolveBatchBroadcastLargerBatches
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py", line 1858 in decorated
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/case.py", line 579 in _callTestMethod
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/case.py", line 623 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/case.py", line 678 in __call__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 122 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 84 in __call__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 122 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/suite.py", line 84 in __call__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/runner.py", line 217 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/main.py", line 274 in runTests
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/external/python_aarch64-unknown-linux-gnu/lib/python3.11/unittest/main.py", line 102 in __init__
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/absl_py/absl/testing/absltest.py", line 2537 in _run_and_get_tests_result
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/absl_py/absl/testing/absltest.py", line 2568 in run_tests
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/absl_py/absl/testing/absltest.py", line 2156 in _run_in_app
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/absl_py/absl/testing/absltest.py", line 2049 in main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/platform/googletest.py", line 51 in g_main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/absl_py/absl/app.py", line 258 in _run_main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/absl_py/absl/app.py", line 312 in run
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/platform/googletest.py", line 60 in main_wrapper
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/platform/benchmark.py", line 489 in benchmarks_main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/platform/googletest.py", line 62 in main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/platform/test.py", line 53 in main
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu.runfiles/org_tensorflow/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test.py", line 244 in <module>
================================================================================
==================== Test output for //tensorflow/python/eager:small_constants_optimizer_test_cpu:
2024-04-02 05:34:40.340335: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
Running tests under Python 3.11.6: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/python_aarch64-unknown-linux-gnu/bin/python3
[ RUN ] FunctionTest.test_grappler_optimization
[ FAILED ] FunctionTest.test_grappler_optimization
INFO:tensorflow:time(__main__.FunctionTest.test_grappler_optimization): 81.93s
I0402 05:36:09.190711 281472974681120 test_util.py:2634] time(__main__.FunctionTest.test_grappler_optimization): 81.93s
[ RUN ] FunctionTest.test_session
[ SKIPPED ] FunctionTest.test_session
[ RUN ] FunctionTest.test_small_constants_optimization_disabled
WARNING:tensorflow:From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py:1971: is_gpu_available (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.config.list_physical_devices('GPU')` instead.
W0402 05:36:09.311903 281472974681120 deprecation.py:50] From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py:1971: is_gpu_available (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.config.list_physical_devices('GPU')` instead.
[ SKIPPED ] FunctionTest.test_small_constants_optimization_disabled
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_disabled): 0.0s
I0402 05:36:09.312510 281472974681120 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_disabled): 0.0s
[ RUN ] FunctionTest.test_small_constants_optimization_invalid_input
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_invalid_input): 0.13s
I0402 05:36:09.444478 281472974681120 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_invalid_input): 0.13s
[ OK ] FunctionTest.test_small_constants_optimization_invalid_input
[ RUN ] FunctionTest.test_small_constants_optimization_with_grappler
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_with_grappler): 82.93s
I0402 05:37:32.380311 281472974681120 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_with_grappler): 82.93s
[ OK ] FunctionTest.test_small_constants_optimization_with_grappler
[ RUN ] FunctionTest.test_small_constants_optimization_without_grappler
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_without_grappler): 97.63s
I0402 05:39:10.019788 281472974681120 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_without_grappler): 97.63s
[ OK ] FunctionTest.test_small_constants_optimization_without_grappler
======================================================================
FAIL: test_grappler_optimization (__main__.FunctionTest.test_grappler_optimization)
FunctionTest.test_grappler_optimization
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py", line 1934, in decorated
    return f(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/eager/small_constants_optimizer_test.py", line 71, in test_grappler_optimization
    self.assertLess(opt_benchmark * 3, benchmark)
AssertionError: 0.4002568703144789 not less than 0.3218843322247267
----------------------------------------------------------------------
Ran 6 tests in 262.789s

FAILED (failures=1, skipped=2)
================================================================================
==================== Test output for //tensorflow/python/eager:small_constants_optimizer_test_cpu:
2024-04-02 05:39:15.626792: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
Running tests under Python 3.11.6: /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/python_aarch64-unknown-linux-gnu/bin/python3
[ RUN ] FunctionTest.test_grappler_optimization
[ FAILED ] FunctionTest.test_grappler_optimization
INFO:tensorflow:time(__main__.FunctionTest.test_grappler_optimization): 76.38s
I0402 05:40:35.598982 281473570403360 test_util.py:2634] time(__main__.FunctionTest.test_grappler_optimization): 76.38s
[ RUN ] FunctionTest.test_session
[ SKIPPED ] FunctionTest.test_session
[ RUN ] FunctionTest.test_small_constants_optimization_disabled
WARNING:tensorflow:From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py:1971: is_gpu_available (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.config.list_physical_devices('GPU')` instead.
W0402 05:40:35.891305 281473570403360 deprecation.py:50] From /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py:1971: is_gpu_available (from tensorflow.python.framework.test_util) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.config.list_physical_devices('GPU')` instead.
[ SKIPPED ] FunctionTest.test_small_constants_optimization_disabled
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_disabled): 0.0s
I0402 05:40:35.891969 281473570403360 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_disabled): 0.0s
[ RUN ] FunctionTest.test_small_constants_optimization_invalid_input
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_invalid_input): 0.24s
I0402 05:40:36.135941 281473570403360 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_invalid_input): 0.24s
[ OK ] FunctionTest.test_small_constants_optimization_invalid_input
[ RUN ] FunctionTest.test_small_constants_optimization_with_grappler
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_with_grappler): 74.88s
I0402 05:41:51.017470 281473570403360 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_with_grappler): 74.88s
[ OK ] FunctionTest.test_small_constants_optimization_with_grappler
[ RUN ] FunctionTest.test_small_constants_optimization_without_grappler
INFO:tensorflow:time(__main__.FunctionTest.test_small_constants_optimization_without_grappler): 59.96s
I0402 05:42:50.983174 281473570403360 test_util.py:2634] time(__main__.FunctionTest.test_small_constants_optimization_without_grappler): 59.96s
[ OK ] FunctionTest.test_small_constants_optimization_without_grappler
======================================================================
FAIL: test_grappler_optimization (__main__.FunctionTest.test_grappler_optimization)
FunctionTest.test_grappler_optimization
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/framework/test_util.py", line 1934, in decorated
    return f(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/bin/tensorflow/python/eager/small_constants_optimizer_test_cpu.runfiles/org_tensorflow/tensorflow/python/eager/small_constants_optimizer_test.py", line 71, in test_grappler_optimization
    self.assertLess(opt_benchmark * 3, benchmark)
AssertionError: 0.46873682364821434 not less than 0.3665033970028162
----------------------------------------------------------------------
Ran 6 tests in 211.772s

FAILED (failures=1, skipped=2)
================================================================================
0.7s //tensorflow/cc:cc_op_gen_test PASSED in 0.8s //tensorflow/cc:client_client_session_test PASSED in 2.0s //tensorflow/cc:coordinator_test PASSED in 3.9s //tensorflow/cc:framework_cc_ops_test PASSED in 2.2s //tensorflow/cc:framework_gradient_checker_test PASSED in 2.3s //tensorflow/cc:framework_gradients_test PASSED in 4.6s //tensorflow/cc:framework_scope_test PASSED in 0.4s //tensorflow/cc:framework_while_gradients_test PASSED in 2.3s //tensorflow/cc:gradients_array_grad_test PASSED in 4.4s //tensorflow/cc:gradients_data_flow_grad_test PASSED in 1.9s //tensorflow/cc:gradients_functional_grad_test PASSED in 1.9s //tensorflow/cc:gradients_image_grad_test PASSED in 4.9s //tensorflow/cc:gradients_linalg_grad_test PASSED in 2.1s //tensorflow/cc:gradients_manip_grad_test PASSED in 1.9s //tensorflow/cc:gradients_math_grad_test PASSED in 4.2s //tensorflow/cc:gradients_nn_grad_test PASSED in 3.3s //tensorflow/cc:gradients_resource_variable_grad_test PASSED in 1.9s //tensorflow/cc:ops_const_op_test PASSED in 0.4s //tensorflow/cc:ops_while_loop_test PASSED in 2.1s //tensorflow/cc:queue_runner_test PASSED in 12.1s //tensorflow/cc/experimental/base/tests:tensor_test PASSED in 0.1s //tensorflow/cc/experimental/base/tests:tensorhandle_test PASSED in 25.2s //tensorflow/cc/experimental/libexport:load_test PASSED in 0.8s //tensorflow/cc/experimental/libexport:save_test PASSED in 0.1s //tensorflow/cc/experimental/libtf:libtf_module_test PASSED in 23.3s //tensorflow/cc/experimental/libtf:libtf_object_test PASSED in 0.1s //tensorflow/cc/experimental/libtf:libtf_perf_test PASSED in 0.1s //tensorflow/cc/experimental/libtf:libtf_runtime_test PASSED in 27.9s //tensorflow/cc/experimental/libtf:libtf_transform_test PASSED in 23.6s //tensorflow/cc/experimental/libtf:libtf_value_test PASSED in 0.9s //tensorflow/cc/experimental/libtf:libtf_visit_test PASSED in 0.2s //tensorflow/cc/experimental/libtf/impl:iostream_test PASSED in 0.1s //tensorflow/cc/experimental/libtf/impl:none_test PASSED in 0.1s //tensorflow/cc/experimental/libtf/impl:scalars_test PASSED in 0.8s //tensorflow/cc/experimental/libtf/impl:string_test PASSED in 0.7s //tensorflow/cc/experimental/libtf/impl:tensor_spec_test PASSED in 0.7s //tensorflow/cc/saved_model:bundle_v2_test PASSED in 0.1s //tensorflow/cc/saved_model:fingerprinting_chunked_test PASSED in 0.1s //tensorflow/cc/saved_model:fingerprinting_test PASSED in 2.5s //tensorflow/cc/saved_model:fingerprinting_utils_test PASSED in 0.9s //tensorflow/cc/saved_model:metrics_test PASSED in 0.6s //tensorflow/cc/saved_model:reader_test PASSED in 0.6s //tensorflow/cc/saved_model:saved_model_bundle_lite_test PASSED in 4.8s //tensorflow/cc/saved_model:saved_model_bundle_test PASSED in 5.1s //tensorflow/cc/saved_model:util_test PASSED in 0.3s //tensorflow/cc/saved_model/experimental/tests:saved_model_api_test PASSED in 24.4s //tensorflow/cc/tools:freeze_saved_model_test PASSED in 2.1s //tensorflow/compiler/aot:codegen_test PASSED in 22.6s //tensorflow/compiler/jit:compilability_check_util_test PASSED in 14.6s //tensorflow/compiler/jit:deadness_analysis_test PASSED in 7.5s //tensorflow/compiler/jit:device_compilation_cache_test PASSED in 3.9s //tensorflow/compiler/jit:device_compilation_cluster_signature_test PASSED in 4.6s //tensorflow/compiler/jit:device_compilation_profiler_test PASSED in 17.4s //tensorflow/compiler/jit:device_compiler_client_test PASSED in 4.6s //tensorflow/compiler/jit:device_compiler_disable_test PASSED in 14.1s //tensorflow/compiler/jit:device_executable_persistor_test PASSED in 
18.2s //tensorflow/compiler/jit:device_util_test PASSED in 3.9s //tensorflow/compiler/jit:encapsulate_util_test PASSED in 1.2s //tensorflow/compiler/jit:node_matchers_test PASSED in 1.0s //tensorflow/compiler/jit:resource_operation_safety_analysis_test PASSED in 7.0s //tensorflow/compiler/jit:shape_inference_test PASSED in 0.9s //tensorflow/compiler/jit:xla_activity_listener_test PASSED in 17.5s //tensorflow/compiler/jit:xla_cluster_util_test PASSED in 7.9s //tensorflow/compiler/jit:xla_compile_util_test PASSED in 4.8s //tensorflow/compiler/jit:xla_kernel_creator_test PASSED in 7.7s //tensorflow/compiler/jit:xla_launch_util_test PASSED in 17.7s //tensorflow/compiler/jit/tests:auto_clustering_test PASSED in 18.6s //tensorflow/compiler/mlir:mlir_graph_optimization_pass_test PASSED in 15.7s //tensorflow/compiler/mlir:register_common_dialects_test PASSED in 13.7s //tensorflow/compiler/mlir/lite:lstm_utils_test PASSED in 0.9s //tensorflow/compiler/mlir/lite:offset_buffer_test PASSED in 0.2s //tensorflow/compiler/mlir/lite:perception_ops_utils_test PASSED in 0.9s //tensorflow/compiler/mlir/lite:size_utils_test PASSED in 0.1s //tensorflow/compiler/mlir/lite:tftext_utils_test PASSED in 1.2s //tensorflow/compiler/mlir/lite/debug:debug_test PASSED in 1.3s //tensorflow/compiler/mlir/lite/experimental/remat:rematerializer_test PASSED in 1.2s //tensorflow/compiler/mlir/lite/experimental/tac:execution_metadata_exporter_test PASSED in 6.1s //tensorflow/compiler/mlir/lite/experimental/tac/tests:compute-cost.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/lite/experimental/tac/tests:device-transform-gpu.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/experimental/tac/tests:device-transform-nnapi.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/lite/experimental/tac/tests:fold-constants-to-subgraph.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/experimental/tac/tests:get-alternative-subgraph.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/experimental/tac/tests:get-op-cost.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/lite/experimental/tac/tests:pick-subgraphs.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/experimental/tac/tests:raise-target-subgraphs.mlir.test PASSED in 0.9s //tensorflow/compiler/mlir/lite/experimental/tac/tests:tac-filter.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/lite/experimental/tac/tests:target-annotation.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/experimental/tac/tests/e2e:device-transform-nnapi.mlir.test PASSED in 1.3s //tensorflow/compiler/mlir/lite/experimental/tac/tests/e2e:simple-graph.mlir.test PASSED in 4.2s //tensorflow/compiler/mlir/lite/metrics:error_collector_inst_test PASSED in 0.5s //tensorflow/compiler/mlir/lite/quantization:numerical_utils_test PASSED in 0.6s //tensorflow/compiler/mlir/lite/quantization/lite:quantize_model_test PASSED in 9.0s //tensorflow/compiler/mlir/lite/quantization/stablehlo:quantization_test PASSED in 12.1s //tensorflow/compiler/mlir/lite/quantization/tensorflow/tests:fallback_to_flex_ops_default.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/quantization/tensorflow/tests:fallback_to_flex_ops_legacy.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/lite/quantization/tensorflow/tests:tf_to_quant.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/lite/quantization/tensorflow/tests:tf_to_quant_4bit.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/lite/quantization/tests:import_quant_stats.mlir.test PASSED in 0.8s 
//tensorflow/compiler/mlir/lite/sparsity:sparsify_model_test PASSED in 1.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:call_xla_module_to_stablehlo.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:compose-uniform-quantized-type.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:composite-lowering.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:fold_broadcast.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:fuse_mhlo_convolution.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-inplaceupdate.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-skip-quantization-ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-skip-stateful-partition-calls.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-stablehlo-tfl-composite.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-stablehlo-vhlo.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-add.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-broadcast.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-clamp.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-concat.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-constant.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-conv.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-max.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-mul.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-pad.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-reshape.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-rsqrt.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo-sub.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize-tfl-stablehlo.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:legalize_hlo.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/stablehlo/tests:odml-to-stablehlo-allow-tf.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/stablehlo/tests:odml-to-stablehlo-smuggle-resize.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/stablehlo/tests:optimize.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/stablehlo/tests:stablehlo-custom-call-legalize-composite.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-clamp.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-concat.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-conv.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-division.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-logistic.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-multiply.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo-resize-bilinear.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-serialize-stablehlo.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tf-tfl-translate-tf-quantize.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tfl_legalize_hlo.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:tfl_legalize_hlo_custom_call.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:unfold_splat_constant_pass.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:unfuse_mhlo_batch_norm.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/stablehlo/tests:uniform-quantized-stablehlo-to-tfl.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:analyze-variables.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:canonicalize.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:const-fold.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:decompose-hybrid-quantization.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:default_quant_params.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:dilated-conv.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:fuse-tftext.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:get-arithmetic-count.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:guarantee_func_has_one_use.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:inlining.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:insert_call_once_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:legalize-tensorlist.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:legalize-tf-assert.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:legalize-tf-hashtables.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:legalize-tf-no-runtime-verification.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:legalize-tf-variables.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:legalize-tf-while.mlir.test PASSED in 1.1s
//tensorflow/compiler/mlir/lite/tests:legalize-tf.mlir.test PASSED in 1.1s
//tensorflow/compiler/mlir/lite/tests:legalize_jax_random.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:lift_tflite_flex_ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:lower-static-tensor-list-default-to-single-batch.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:lower-static-tensor-list-enable-dynamic-update-slice.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:lower-static-tensor-list.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:modify_io_nodes.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:ops.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/lite/tests:optimize-after-quantization.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:optimize.mlir.test PASSED in 1.7s
//tensorflow/compiler/mlir/lite/tests:optimize_batch_matmul.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:optimize_functional_ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:optimize_no_verify.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:optimize_op_order.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:partitioned-topological-sort.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:pin-ops-with-side-effects.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:post-quantize-dynamic-range.mlir.test PASSED in 1.1s
//tensorflow/compiler/mlir/lite/tests:post-quantize.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:prepare-composite-functions-tf.mlir.test PASSED in 1.3s
//tensorflow/compiler/mlir/lite/tests:prepare-quantize-dynamic-range.mlir.test PASSED in 1.7s
//tensorflow/compiler/mlir/lite/tests:prepare-quantize-post-training-16bits.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:prepare-quantize-post-training.mlir.test PASSED in 1.4s
//tensorflow/compiler/mlir/lite/tests:prepare-quantize-signed.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:prepare-quantize.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:prepare-tf-fake-quant-4bit.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:prepare-tf-fake-quant.mlir.test PASSED in 1.1s
//tensorflow/compiler/mlir/lite/tests:prepare-tf-with-allowing-bf16-and-f16-type-legalization.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:prepare-tf.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:push-tpose-through-ewise.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:quantize-dynamic-range.mlir.test PASSED in 1.6s
//tensorflow/compiler/mlir/lite/tests:quantize-numeric-verify.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:quantize-variables.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests:quantize.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/lite/tests:raise-custom-ops.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/lite/tests:reduce-type-precision.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:reduce_while_operands.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:shape-inference.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:split-merged-operands.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests:tfl_while_op_licm.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:tfl_while_outline.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests:trim-functions-tf.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests:unfold-large-splat-constant.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/debuginfo:v1_1.0_224_frozen.wrong_attr.line.part.pbtxt.test PASSED in 1.0s
//tensorflow/compiler/mlir/lite/tests/debuginfo:v1_1.0_224_frozen.wrong_attr.stack.part.pbtxt.test PASSED in 1.1s
//tensorflow/compiler/mlir/lite/tests/end2end:add.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:back2back_fake_quant.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:control_flow_v1.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:conv_2d.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:conv_2d_nchw.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:custom_opdef.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:disallow_stateful_partitioned_call.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests/end2end:fake_quant_per_channel.pbtxt.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests/end2end:fake_quant_per_channel_4bit.pbtxt.test PASSED in 1.0s
//tensorflow/compiler/mlir/lite/tests/end2end:fake_quant_without_identity.pbtxt.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests/end2end:fake_quant_without_identity_4bit.pbtxt.test PASSED in 1.0s
//tensorflow/compiler/mlir/lite/tests/end2end:graph-input-node.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/end2end:graph_with_placeholder_with_default.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/end2end:if_op.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/end2end:quant_stats.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests/end2end:unroll_batch_matmul.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/end2end:unroll_batch_matmul_disabled.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:basic_lstm.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:bucketize.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:cast_bf16.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:constants.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:constants_offset.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:control_edges.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:custom_op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:custom_op_offset.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:dynamic_shape.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:empty_input_output_names.json.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:external_constant.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:if_op.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:import_json.json.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:importer_test_min_max.cc.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:importer_test_min_max.cc.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:input_arrays.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:input_output_names_attr.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:legacy_reshape.json.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:lstm.json.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:lstm.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:many_attribute_op.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:math.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:matmul.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:mix_tflite_vhlo.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:multi_output_op.json.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:optional.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:optional_input.json.test PASSED in 0.8s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:output_arrays.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:pruning.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:pruning_function_input_as_output.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:quant_stats.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:quantization.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:reshape.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:signature.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:signature_with_multiple_entry_points.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:simple.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:tf_variant_type.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:unranked_function_output.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:unranked_tensor.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:variable.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:vhlo.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:vhlo_const.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:vhlo_custom_call.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/flatbuffer2mlir:while_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/mlir2exec:tfl_while_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:basic_lstm.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:bucketize.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:cast_bf16.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:custom_op_with_tflite_op.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:custom_tensorlist_reserve.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:deduplicate_const.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:depthwise_conv2d.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:depthwise_conv2d_v2.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:disable_builtin.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:disable_custom.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:disable_flex.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:disable_flex_enable_builtin.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:dynamic_shape_constant.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:fake_quant.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:flex_exclusively.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:flex_op_with_complex128.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:flex_op_with_f64.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:flex_op_with_tflite_op.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:fully_connected.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:fully_connected_v2.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:hashtable_resource.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:if_op.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:logical.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:low_bit_packing.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:lstm.mlir.test PASSED in 2.1s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:lstm_asym_attr.mlir.test PASSED in 2.1s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:lstm_quantized.mlir.test PASSED in 1.7s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:math.mlir.test PASSED in 2.2s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:metadata.mlir.test PASSED in 1.8s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:mul_v2.mlir.test PASSED in 2.6s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:mul_v3.mlir.test PASSED in 3.4s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:nn.mlir.test PASSED in 2.2s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:numeric_verify.mlir.test PASSED in 2.1s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:optional.mlir.test PASSED in 1.1s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:quantization.mlir.test PASSED in 1.2s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:reshape.mlir.test PASSED in 1.4s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:signature_def.mlir.test PASSED in 1.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:signature_def_output_override.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:signature_def_with_multiple_entry_points.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:signature_def_with_no_inputs.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:simple.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:simple_with_connected_control_nodes.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:simple_with_unconnected_control_nodes.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:svdf.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:svdf_v2.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:tf_entry_function.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:tfl_while_op.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:transpose_conv_optional.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:type_attr.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:u16_quant.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:unidirectional_sequence_lstm.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:unidirectional_sequence_rnn.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:unranked_tensor.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:unsorted_segment_prod.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:variable.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:variant_type_on_func.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:variant_type_on_op.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/lite/tests/mlir2flatbuffer:while_op.mlir.test PASSED in 0.5s
//tensorflow/compiler/mlir/quantization/common:attrs_and_constraints_test PASSED in 6.7s
//tensorflow/compiler/mlir/quantization/common:func_test PASSED in 6.1s
//tensorflow/compiler/mlir/quantization/common:lift_as_function_call_test PASSED in 6.9s
//tensorflow/compiler/mlir/quantization/common:uniform_quantized_types_test PASSED in 6.9s
//tensorflow/compiler/mlir/quantization/common/python:testing_test PASSED in 12.5s
//tensorflow/compiler/mlir/quantization/common/quantization_lib:quantization_driver_test PASSED in 6.3s
//tensorflow/compiler/mlir/quantization/stablehlo:bfloat16_type_test PASSED in 18.2s
//tensorflow/compiler/mlir/quantization/stablehlo:convert_tf_quant_to_mhlo_int_test PASSED in 14.6s
//tensorflow/compiler/mlir/quantization/stablehlo:convert_tf_quant_types_test PASSED in 13.6s
//tensorflow/compiler/mlir/quantization/stablehlo:math_utils_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/stablehlo:stablehlo_type_utils_test PASSED in 0.5s
//tensorflow/compiler/mlir/quantization/stablehlo:tf_type_utils_test PASSED in 17.3s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:config_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:graph_def_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:io_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:permutation_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:pre_calibration_test PASSED in 11.3s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:report_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:saved_model_export_test PASSED in 11.2s
//tensorflow/compiler/mlir/quantization/stablehlo/cc:saved_model_import_test PASSED in 12.9s
//tensorflow/compiler/mlir/quantization/stablehlo/cc/calibration:representative_dataset_test PASSED in 0.2s
//tensorflow/compiler/mlir/quantization/stablehlo/ops:stablehlo_op_quant_spec_test PASSED in 6.0s
//tensorflow/compiler/mlir/quantization/stablehlo/tests:fill_quantization_options_test PASSED in 2.3s
//tensorflow/compiler/mlir/quantization/tensorflow/calibrator:calibration_algorithm_test PASSED in 32.0s
//tensorflow/compiler/mlir/quantization/tensorflow/calibrator:calibration_statistics_collector_test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/calibrator:calibrator_singleton_test PASSED in 0.1s
//tensorflow/compiler/mlir/quantization/tensorflow/calibrator:custom_aggregator_op_test PASSED in 23.9s
//tensorflow/compiler/mlir/quantization/tensorflow/cc:const_op_size_test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/cc:constant_fold_test PASSED in 8.8s
//tensorflow/compiler/mlir/quantization/tensorflow/cc:convert_asset_args_test PASSED in 4.6s
//tensorflow/compiler/mlir/quantization/tensorflow/cc:save_variables_test PASSED in 0.9s
//tensorflow/compiler/mlir/quantization/tensorflow/debugging:mlir_dump_test PASSED in 0.9s
//tensorflow/compiler/mlir/quantization/tensorflow/ops:tf_op_quant_spec_test PASSED in 0.9s
//tensorflow/compiler/mlir/quantization/tensorflow/ops:tf_quantize_op_test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/python:concurrency_test PASSED in 61.0s
//tensorflow/compiler/mlir/quantization/tensorflow/python:py_function_lib_py_test PASSED in 23.8s
//tensorflow/compiler/mlir/quantization/tensorflow/python:pywrap_quantize_model_test PASSED in 24.4s
//tensorflow/compiler/mlir/quantization/tensorflow/python:representative_dataset_test PASSED in 15.5s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:add_dump_tensor_op.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:add_dump_tensor_op_stablehlo.mlir.test PASSED in 1.2s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:add_quantization_unit_loc.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:cast_bf16_ops_to_f32.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:convert_custom_aggregation_op_to_quant_stats.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:convert_fake_quant_to_qdq.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:convert_tf_xla_op_to_tf_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:convert_tpu_model_to_cpu.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:duplicate_shape_determining_constants.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:fake_quant_e2e_flow.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:fake_quant_e2e_xla.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_custom_aggregation_ops.mlir.test PASSED in 1.1s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_main_function.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_quantized_functions.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_quantized_functions_drq.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_quantized_functions_weight_only.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_restore_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:insert_save_op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:issue_ids_of_custom_aggregation_ops.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:lift_hashtable_ops_as_args.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:lift_quantizable_spots_as_functions.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:lift_quantizable_spots_as_functions_drq.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:lift_quantizable_spots_as_functions_drq_min_elements.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:lift_quantizable_spots_as_functions_xla.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:lift_quantizable_spots_as_functions_xla_selective_quantization.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:mark_functions_noinline.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:merge_duplicate_resource_ops.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:merge_initializer_function_ops_to_main.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:merge_save_function_ops_to_main.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:optimize.mlir.test PASSED in 2.2s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:prepare_lifting.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:prepare_quantize.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:prepare_quantize_drq.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:prepare_quantize_drq_per_channel.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:prepare_quantize_ptq.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:prepare_quantize_ptq_per_channel.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:preprocess_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:preprocess_op_weight_only.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:propagate_quantize_type.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_composit_functions_debugging.mlir.test PASSED in 4.1s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_composite_functions.mlir.test PASSED in 1.2s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_composite_functions_drq.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_composite_functions_weight_only.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_composite_functions_xla.mlir.test PASSED in 2.1s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_drq.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_weights.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:quantize_xla.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:remove_var_init_by_const.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:replace_cast_hacks_with_tf_xla_ops.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:replace_cast_hacks_with_tf_xla_ops_large_constants.mlir.test PASSED in 10.6s
//tensorflow/compiler/mlir/quantization/tensorflow/tests:unfreeze_constants.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/quantization/tensorflow/utils:tf_to_uniform_attribute_utils_test PASSED in 1.0s
//tensorflow/compiler/mlir/quantization/tensorflow/utils:tf_to_xla_attribute_utils_test PASSED in 27.0s
//tensorflow/compiler/mlir/stablehlo:stablehlo_test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow:bridge_logger_test PASSED in 4.8s
//tensorflow/compiler/mlir/tensorflow:call_graph_util_test PASSED in 0.3s
//tensorflow/compiler/mlir/tensorflow:cluster_util_test PASSED in 1.3s
//tensorflow/compiler/mlir/tensorflow:convert_tensor_test PASSED in 0.5s
//tensorflow/compiler/mlir/tensorflow:convert_type_test PASSED in 1.1s
//tensorflow/compiler/mlir/tensorflow:data_dumper_logger_config_test PASSED in 5.6s
//tensorflow/compiler/mlir/tensorflow:device_util_test PASSED in 0.3s
//tensorflow/compiler/mlir/tensorflow:dump_graph_test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow:dump_mlir_util_test PASSED in 10.9s
//tensorflow/compiler/mlir/tensorflow:error_util_test PASSED in 0.1s
//tensorflow/compiler/mlir/tensorflow:tf_saved_model_test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow:tpu_rewrite_device_util_test PASSED in 0.5s
//tensorflow/compiler/mlir/tensorflow:xla_rewrite_util_test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:add_functions_for_exported_names.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:annotate-parameter-replication.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:batchmatmul_to_einsum.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:breakup-islands.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:cannonicalize_ops_outside_compilation.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:canonicalize.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:canonicalize_compile_and_replicate_attributes.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:check_control_dependencies.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:cluster_formation.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:cluster_ops_by_policy.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:cluster_outlining.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:cluster_tf_ops_pass.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:colocate_tpu_copy_with_dynamic_shape.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:constant-fold.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:constant_op_device_assignment.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:convert-tf-control-flow-to-scf.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:convert_control_to_data_outputs.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:convert_launch_func_to_tf_call.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:convert_session_initializer_to_function.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:convert_to_legacy_compile_and_replicate_attributes.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:decompose_reduce_dataset.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:decompose_resource_ops.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/tensorflow/tests:device_assignment.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:device_assignment_by_func_attr.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:device_attribute_to_launch.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:device_canonicalize.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:device_copy.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:drop_while_shape_invariant.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:einsum.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:embedding_pipelining.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:embedding_program_key.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:embedding_sequencing.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:empty-main.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:end-to-end-tpu-reshard-variables.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:executor_canonicalize.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:executor_island_coarsening.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:executor_island_materialize_const.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:extract_head_tail_outside_compilation.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:extract_outside_compilation.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/tensorflow/tests:extract_tpu_copy_with_dynamic_shape_op.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:fold-broadcast.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:freeze_variables.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:func-attr-invalid.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:func-attr.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:functional-control-flow-to-cfg.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:functional-control-flow-to-regions.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:functionalize-if-fail.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:functionalize-if.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:fused_kernel_matcher.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:gpu_fusion.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:graph_pruning.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:graph_pruning_preserve_ops.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:group_by_dialect.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:guarantee-all-funcs-one-use.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:hoist_broadcast_read.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:hoist_loop_invariant.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:hoist_replicate_invariant_resource_writes.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:init_text_file_to_import.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:init_text_file_to_import_invalid.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:init_text_file_to_import_saved_model.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:inlining.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:isolate-placer.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:launch_outlining.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:launch_to_device_attribute.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:launch_to_device_attribute_legacy.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_layout_assignment_gpu_cc_60.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_layout_assignment_gpu_cc_70.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_layout_assignment_to_nchw.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_layout_assignment_to_nhwc.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_move_transposes_begin.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_move_transposes_end.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_to_nchw.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:layout_optimization_to_nhwc.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:legalize_tfg.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:legalize_tfg_arg_control_dep.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:legalize_tfg_with_control_flow.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:localize_var_handles.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:lower_globals_to_ml_program.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:lower_globals_to_ml_program_invalid.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:lower_quantized.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:lower_tf.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/tensorflow/tests:lower_variable_ops_to_ml_program.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:mark_input_output_aliases.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:mark_ops_for_outside_compilation.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:materialize_passthrough_op.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:merge_control_flow.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:mlprogram.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:move_tpu_compile_to_front.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:name_anonymous_iterators.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:optimize-arg-operand-constraint.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:optimize.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:order_by_dialect.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:parallel_execute_to_islands.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:parallel_execute_to_islands_legacy.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:prepare_tpu_computation_for_tf_export.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:print.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:promote_resources_to_args.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/tensorflow/tests:promote_resources_to_args_functions.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:promote_var_handles_to_args.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:readonly_references_to_resources.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:region-control-flow-to-functional.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:remove_unused_arguments.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:remove_unused_while_results.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:replica_id_to_device_ordinal.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:replicate_invariant_op_hoisting.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:replicate_tensor_list_init_ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:replicate_to_island.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:replicate_to_island_legacy.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:resource-alias-analysis-test.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:resource-device-inference.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:resource_analyzer.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:resource_inlining.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:resource_op_lifting.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/tensorflow/tests:rewrite_tpu_embedding_ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:roundtrip-tf-executor.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:shape_inference.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:side-effect-analysis-test.mlir.test PASSED in 1.3s
//tensorflow/compiler/mlir/tensorflow/tests:sink_constant.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:split_into_island_per_op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:stack_ops_decomposition.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:strip_noinline.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:strip_saved_module_metadata.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:strip_tf_attributes.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tensor_array_ops_decomposition.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:tensor_list_ops_decomposition.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf-executor-to-functional.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf-functional-to-executor.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf-ops.mlir.test PASSED in 2.3s
//tensorflow/compiler/mlir/tensorflow/tests:tf-reduce-identity.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_data_fuse_map_and_batch.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_data_fuse_pmap_and_batch.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_device_index_selector.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_device_ops.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_device_ops_invalid.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_executor_ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_executor_ops_invalid.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tf_executor_ops_location_roundtrip.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_executor_ops_printer.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_executor_ops_side_effect.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_optimize.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_asset_sinking.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_deduplicate_bound_input_bindings.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_freeze_assets.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_freeze_global_tensors.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_freeze_global_tensors_mutable_tensors.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_initialize_variables_in_session_init.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_initialize_variables_in_session_init_fail.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_lift_variables.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_lift_variables_invalid_session.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_mark_initialized_variables.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_ops_invalid.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_optimize_global_tensors.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_optimize_global_tensors_interprocedural.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tf_saved_model_remove_vars_in_session_initializer.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_side_effect.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tf_trait_folds.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tfrt_ops.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-annotate-dynamic-shape-inputs.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-cluster-cleanup-attributes.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-dynamic-layout-pass.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-merge-variables-with-execute.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-multiple-while-body-func.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-resource-read-for-write.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tpu-variable-runtime-reformatting.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_cluster_formation.mlir.test PASSED in 1.0s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_colocate_composite_resource_ops.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_colocate_splits.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_device_propagation.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_host_computation_expansion.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_identity_pruning.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_parallel_execute_sink_resource_write.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_partitioned_op_conversion.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_reorder_replicate_and_partitioned_inputs.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_resource_partitioning.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_rewrite.mlir.test PASSED in 1.4s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_sharding_identification.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_space_to_depth_pass.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_tail_with_tobool_op.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_update_embedding_enqueue_op_inputs.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:tpu_validate_inputs.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:transpose-op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:unroll-batch-matmul.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:update_control_dependencies.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:verify_for_export.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:warn_when_using_deprecated_dumps.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:while_licm.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:xla_broadcast.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:xla_call_module_deserialization.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:xla_call_module_round_trip.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:xla_call_module_serialization.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests:xla_cluster_formation.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests:xla_inline_device_ops.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests:xla_outline_entry_functions.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:xla_rewrite.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:xla_rewrite_v2.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests:xla_sharding_util_test PASSED in 0.3s
//tensorflow/compiler/mlir/tensorflow/tests:xla_validate_iputs.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:add.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:argument-sharding-invalid.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:argument-sharding.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:constant-folding-hook.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:constant-folding.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:convert_mhlo_quant_to_int.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:graph-resource.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:graph-resource.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:graph.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:mlir-module-serialized-str-attr.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:replicate-tensor-list-init-ops.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:result-sharding.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:serialized-mlir-module-str-attr-invalid.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:serialized-mlir-module-str-attr.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:shape-inference-after-legalization.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:shape-inference.mlir.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests/compile_mlir_util:stablehlo_add.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_island_coarsening:executor_tpuv1_island_coarsening.mlir.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_island_coarsening:while_op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_island_inlining:executor_tpuv1_inline_tpu_island.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_island_inlining:while_op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_outline_island:case_op.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_outline_island:executor_tpuv1_outline_tpu_island.mlir.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/executor_tpuv1_outline_island:while_op.mlir.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:add.pbtxt.test PASSED in 1.1s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:arg-as-fetch.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:arg-control-dep.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:arg-data-type-with-subtype.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:arg-data-type.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:arg-multi-data-type-with-subtype.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:arg-retval-attrs.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:case_op.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:const-values.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:device-arg-retval-attr.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:empty-input-shapes.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:empty-value-attr.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:feed-as-fetch.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:feed-control-dep.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:force_shared_name_for_resource_ops.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:function-func-attr.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:functional-if-ops.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:functional-while-ops.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-as-function-control-ret.pbtxt.test PASSED in 0.9s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-as-function-retval-of-arg.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-as-function.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-custom-operation.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-default-attr.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-device-retval.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-empty-tensor-content.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-func-attr.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-call.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-control-ret-diff-island.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-control-ret-same-island.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-defs.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-input-shapes.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-name-bug.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-function-resource-args.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-gradient-def.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-input-func-arg-name-collision.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-library.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-malformed.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-scalar-input.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-uint8-return.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-undefined-output.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-version-info.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:graph-while-loop.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:invalid-output-index.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:legacy-fed-input-without-inputs.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:merge_node_with_function.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:mlir_passthrough_op.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:multi-output-feeds.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:multiple-use-next-iteration.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:node-locations.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:output-shapes-attr.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:output-shapes.pbtxt.test PASSED in 0.7s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:parse_example.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:parse_example_v2.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:partial-device-name.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:prune_unused_nodes.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:quint8-const.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:shape-attrs.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:stateful-attribute.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:string-attr.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:switch_n.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:target.pbtxt.test PASSED in 0.8s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:tensor-list.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:tf-data-pipeline.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir:unregistered_kernel.pbtxt.test PASSED in 0.6s
//tensorflow/compiler/mlir/tensorflow/tests/graphdef2mlir/batch_use_same_function:saved_model.pbtxt.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graph:convert_tensor.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:aliasing_arg_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:case.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:convert_tensor.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:derived_shape_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:derived_size_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:device-arg-retval-attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:export_main_to_flib.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:fetch_feed_names.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:func_attr.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:func_list_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:function-control-ret.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:function-order.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:function-resource-args-handle-info.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:function-resource-args.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:functional-if-ops.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:functional-while-ops.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:graph-as-function.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:infer_derived_attribute.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:invalid_input.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:legalized_name.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:missing-main.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:noop.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:optional_symbol_ref.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:output-shapes-attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:parse_example.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:parse_example_v2.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:preserve-entry-func-names.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:ref-type-attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:ref-while-loop.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:shape_list_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:simple.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:simple_tf_dialect_op.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:stringescape.mlir.test PASSED in 0.6s 
//tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:switchn.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:tf-gradient-attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:tf-legacy-call.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:tf_add.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:tf_identity_n.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:tf_tpu_embedding_ops.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:type_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:type_list_attr.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:unique_name.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:unique_output_name.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/mlir2graphdef:while-loop.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tensorflow/tests/tf_to_hlo_pipeline:sccp-post-shape-inference.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tensorflow/transforms:verify_no_outside_compilation_markers_pass_test PASSED in 11.8s //tensorflow/compiler/mlir/tensorflow/transforms/host_runtime:lower_cluster_to_runtime_ops_test PASSED in 10.0s //tensorflow/compiler/mlir/tensorflow/transforms/host_runtime:tpu_metadata_utils_test PASSED in 9.8s //tensorflow/compiler/mlir/tensorflow/translate:tf_mlir_translate_registration_test PASSED in 12.6s //tensorflow/compiler/mlir/tf2xla/api/v1:cluster_tf_test PASSED in 19.1s //tensorflow/compiler/mlir/tf2xla/api/v1:compile_mlir_util_test PASSED in 4.7s //tensorflow/compiler/mlir/tf2xla/api/v1:compile_tf_graph_test PASSED in 0.3s //tensorflow/compiler/mlir/tf2xla/api/v1:tf_dialect_to_executor_test PASSED in 13.4s //tensorflow/compiler/mlir/tf2xla/api/v2:cluster_tf_test PASSED in 19.0s //tensorflow/compiler/mlir/tf2xla/api/v2:legalize_tf_test PASSED in 15.7s //tensorflow/compiler/mlir/tf2xla/api/v2:tf_dialect_to_executor_test PASSED in 13.4s //tensorflow/compiler/mlir/tf2xla/internal:clustering_bridge_passes_test PASSED in 5.4s //tensorflow/compiler/mlir/tf2xla/internal:compilation_timer_test PASSED in 0.2s //tensorflow/compiler/mlir/tf2xla/internal:legalize_tf_mlir_test PASSED in 14.9s //tensorflow/compiler/mlir/tf2xla/internal:legalize_tf_to_hlo_test PASSED in 16.8s //tensorflow/compiler/mlir/tf2xla/internal:logging_hooks_test PASSED in 15.3s //tensorflow/compiler/mlir/tf2xla/internal:mlir_bridge_pass_util_test PASSED in 0.6s //tensorflow/compiler/mlir/tf2xla/internal:mlir_pass_instrumentation_test PASSED in 5.4s //tensorflow/compiler/mlir/tf2xla/internal:test_matchers_test PASSED in 3.9s //tensorflow/compiler/mlir/tf2xla/internal/inference:inference_metrics_pass_test PASSED in 11.6s //tensorflow/compiler/mlir/tf2xla/internal/passes:input_metrics_lowering_pass_test PASSED in 11.7s //tensorflow/compiler/mlir/tf2xla/internal/passes:tpu_cluster_formation_test PASSED in 11.6s //tensorflow/compiler/mlir/tf2xla/internal/passes:verify_clustering_pass_test PASSED in 11.6s //tensorflow/compiler/mlir/tf2xla/internal/passes:verify_clustering_pass_test.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tf2xla/internal/passes:verify_input_dialect_to_executor_pass_test.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tf2xla/internal/utils:dialect_detection_utils_test PASSED in 0.7s 
//tensorflow/compiler/mlir/tf2xla/tests:adjust-layout.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tf2xla/tests:hlo_xla_runtime_pipeline.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-BatchMatMulV2.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-binary-elementwise.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-collective.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-communication.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-include-tf2xla-fallback.mlir.test PASSED in 1.0s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-prefer-tf2xla.mlir.test PASSED in 0.9s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-quant.mlir.test PASSED in 1.0s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf-with-tf2xla-hlo-importer.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tf2xla/tests:legalize-tf.mlir.test PASSED in 10.0s //tensorflow/compiler/mlir/tf2xla/tests:tfxla_device_specific_transformations_cpu.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tf2xla/tests:tfxla_device_specific_transformations_gpu.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tf2xla/tests:verify-tfxla-legalization-no-chlo.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tf2xla/tests:verify-tfxla-legalization.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tf2xla/transforms:legalization_op_config_test PASSED in 22.9s //tensorflow/compiler/mlir/tf2xla/transforms:tf2xla_rewriter_test PASSED in 12.2s //tensorflow/compiler/mlir/tf2xla/transforms:verify_tfxla_legalization_test PASSED in 11.7s //tensorflow/compiler/mlir/tf2xla/transforms:xla_legalize_targets_test PASSED in 0.7s //tensorflow/compiler/mlir/tf2xla/transforms:xla_legalize_tf_test PASSED in 3.5s //tensorflow/compiler/mlir/tfr:graph_decompose_test PASSED in 13.2s //tensorflow/compiler/mlir/tfr:node_expansion_test PASSED in 11.2s //tensorflow/compiler/mlir/tfr:op_reg_gen_test PASSED in 116.0s //tensorflow/compiler/mlir/tfr:tfr_decompose_ctx_test PASSED in 5.4s //tensorflow/compiler/mlir/tfr:tfr_gen_test PASSED in 124.2s //tensorflow/compiler/mlir/tfr/examples/customization:test_ops_test PASSED in 30.7s //tensorflow/compiler/mlir/tfr/examples/mnist:mnist_ops_test PASSED in 24.9s //tensorflow/compiler/mlir/tfr/examples/pad:pad_ops_test PASSED in 25.0s //tensorflow/compiler/mlir/tfrt/tests:batch_function_fallback_resource_variable_as_captured_tensor.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests:batch_function_lowering.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:convert_ref_variables.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:cross_device_transfer.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:deduplicate_if_results.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tfrt/tests:fuse_tpu_compile_and_execute_ops.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:hoist_invariant_ops.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tfrt/tests:hoist_invariant_ops_mlrt.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:optimize.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests:remove_device_attribute.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests:runtime_lowering_gpu.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests:runtime_lowering_tpu.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:sink_in_invariant_ops.mlir.test PASSED in 0.7s 
//tensorflow/compiler/mlir/tfrt/tests:xla_launch_fallback.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tfrt/tests:xla_launch_lowering.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests:xla_rewrite.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/analysis:cost_analysis.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/analysis:tensor_array_side_effect_analysis.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/analysis:update_op_cost_in_tfrt_mlir_test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ifrt:lower_to_ifrt_restore_variable.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ifrt:rewrite_cluster_to_ifrt_call.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/ifrt:sink_variable_as_named_array.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ifrt:tf_identity_propagation.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ifrt:tf_restore_merging.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ifrt:tf_restore_pruning.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ifrt:tf_restore_splitting.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/ir:fallback_opt.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/ir:tfrt_fallback_util_test PASSED in 0.9s //tensorflow/compiler/mlir/tfrt/tests/mlrt:assign_op_key.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/mlrt:async_while.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/mlrt:fuse_mlrt_ops.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/mlrt:inline.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/mlrt:parallelization.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/mlrt:tf_to_mlrt.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/mlrt:tpu_conversions.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/mlrt:while_to_map_fn.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:attributes.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:basic.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:batch_function_deduplicate.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:batch_function_deduplicate_failed.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:const_tensor.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:control_flow.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:decompose_resource_op.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:derived_attrs.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:device_conversion.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:errors.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:fallback.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:fallback_canonicalization.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:fallback_inline.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:func_attributes.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:func_attributes_multiple_callers.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:func_use_fallback_tensor.mlir.test PASSED in 0.6s 
//tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:insert_fallback_tensor_copy.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:merge_tf_if_ops.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:optimize_tf_control_flow_side_effect.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:remove_tf_if_const_args.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:reorder_assert.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:side_effects.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:tf_to_corert_pipeline.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:tf_to_corert_pipeline_refvar.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/tests/tf_to_corert:whileop.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tfrt/translate/mlrt:mlir_to_bytecode_test PASSED in 0.1s //tensorflow/compiler/mlir/tools/kernel_gen/tests:buffer_deallocation.mlir.test PASSED in 0.5s //tensorflow/compiler/mlir/tools/kernel_gen/tests:buffer_reuse.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:bufferize.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tools/kernel_gen/tests:copy_cleanup.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tools/kernel_gen/tests:embed_tf_framework.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:func_to_jit_invocations.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:invalid.mlir.test PASSED in 0.5s //tensorflow/compiler/mlir/tools/kernel_gen/tests:isinf.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:ops.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:parallel_loops_to_sequential.mlir.test PASSED in 0.5s //tensorflow/compiler/mlir/tools/kernel_gen/tests:rewrite_tf_framework_assert.mlir.test PASSED in 0.5s //tensorflow/compiler/mlir/tools/kernel_gen/tests:tf_abi_knowledge.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:tf_framework_legalize_to_llvm.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tools/kernel_gen/tests:tf_kernel_gpu_launch_to_llvm.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tosa/tests:convert-tfl-uint8.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:convert_metadata.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tosa/tests:fuse-bias-tf.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:lower-complex-types.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:multi_add.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tosa/tests:retain_call_once_funcs.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:strip-quant-types.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tosa/tests:strip_metadata.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:tf-tfl-to-tosa-pipeline.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:tf-to-tosa-pipeline.mlir.test PASSED in 1.0s //tensorflow/compiler/mlir/tosa/tests:tfl-to-tosa-dequantize_softmax.mlir.test PASSED in 0.6s //tensorflow/compiler/mlir/tosa/tests:tfl-to-tosa-pipeline-filtered.mlir.test PASSED in 0.7s //tensorflow/compiler/mlir/tosa/tests:tfl-to-tosa-pipeline.mlir.test PASSED in 5.6s //tensorflow/compiler/mlir/tosa/tests:tfl-to-tosa-stateful.mlir.test PASSED in 0.8s //tensorflow/compiler/mlir/tosa/tests:verify_fully_converted.mlir.test PASSED in 0.7s //tensorflow/compiler/tests:adadelta_test_cpu 
PASSED in 15.0s //tensorflow/compiler/tests:adagrad_da_test_cpu PASSED in 12.8s //tensorflow/compiler/tests:adagrad_test_cpu PASSED in 11.7s //tensorflow/compiler/tests:adam_test_cpu PASSED in 14.8s //tensorflow/compiler/tests:add_n_test_cpu PASSED in 10.7s //tensorflow/compiler/tests:argminmax_test_cpu PASSED in 19.8s //tensorflow/compiler/tests:argminmax_test_cpu_mlir_bridge_test PASSED in 22.0s //tensorflow/compiler/tests:async_comp_test_cpu PASSED in 10.0s //tensorflow/compiler/tests:bincount_op_test_cpu PASSED in 10.4s //tensorflow/compiler/tests:bucketize_op_test_cpu PASSED in 11.5s //tensorflow/compiler/tests:bucketize_op_test_cpu_mlir_bridge_test PASSED in 10.8s //tensorflow/compiler/tests:case_test_cpu PASSED in 12.2s //tensorflow/compiler/tests:cast_ops_test_cpu PASSED in 10.9s //tensorflow/compiler/tests:cast_ops_test_cpu_mlir_bridge_test PASSED in 11.4s //tensorflow/compiler/tests:categorical_op_test_cpu PASSED in 33.2s //tensorflow/compiler/tests:categorical_op_test_cpu_mlir_bridge_test PASSED in 33.4s //tensorflow/compiler/tests:cholesky_op_test_cpu PASSED in 59.8s //tensorflow/compiler/tests:cholesky_op_test_cpu_mlir_bridge_test PASSED in 51.2s //tensorflow/compiler/tests:clustering_test_cpu PASSED in 80.1s //tensorflow/compiler/tests:clustering_test_cpu_mlir_bridge_test PASSED in 64.4s //tensorflow/compiler/tests:concat_ops_test_cpu PASSED in 79.5s //tensorflow/compiler/tests:concat_ops_test_cpu_mlir_bridge_test PASSED in 80.5s //tensorflow/compiler/tests:cond_test_cpu PASSED in 74.5s //tensorflow/compiler/tests:const_arg_test_cpu PASSED in 72.5s //tensorflow/compiler/tests:const_test_cpu PASSED in 11.6s //tensorflow/compiler/tests:data_format_ops_test_cpu PASSED in 12.2s //tensorflow/compiler/tests:data_format_ops_test_cpu_mlir_bridge_test PASSED in 13.3s //tensorflow/compiler/tests:dense_layer_test_cpu PASSED in 20.3s //tensorflow/compiler/tests:dynamic_slice_ops_test_cpu PASSED in 15.2s //tensorflow/compiler/tests:dynamic_slice_ops_test_cpu_mlir_bridge_test PASSED in 13.5s //tensorflow/compiler/tests:dynamic_stitch_test_cpu PASSED in 9.7s //tensorflow/compiler/tests:dynamic_stitch_test_cpu_mlir_bridge_test PASSED in 10.3s //tensorflow/compiler/tests:eager_test_cpu PASSED in 23.3s //tensorflow/compiler/tests:einsum_op_test_cpu PASSED in 11.9s //tensorflow/compiler/tests:einsum_op_test_cpu_mlir_bridge_test PASSED in 9.8s //tensorflow/compiler/tests:ensure_shape_op_test_cpu PASSED in 9.3s //tensorflow/compiler/tests:extract_image_patches_op_test_cpu PASSED in 20.2s //tensorflow/compiler/tests:extract_image_patches_op_test_cpu_mlir_bridge_test PASSED in 10.6s //tensorflow/compiler/tests:fake_quant_ops_test_cpu PASSED in 20.1s //tensorflow/compiler/tests:fake_quant_ops_test_cpu_mlir_bridge_test PASSED in 17.5s //tensorflow/compiler/tests:fifo_queue_test_cpu PASSED in 14.0s //tensorflow/compiler/tests:fifo_queue_test_cpu_mlir_bridge_test PASSED in 13.7s //tensorflow/compiler/tests:ftrl_ops_test_cpu PASSED in 10.9s //tensorflow/compiler/tests:ftrl_ops_test_cpu_mlir_bridge_test PASSED in 11.7s //tensorflow/compiler/tests:function_test_cpu PASSED in 9.4s //tensorflow/compiler/tests:function_test_cpu_mlir_bridge_test PASSED in 10.6s //tensorflow/compiler/tests:gather_nd_op_test_cpu PASSED in 9.8s //tensorflow/compiler/tests:gather_nd_op_test_cpu_mlir_bridge_test PASSED in 11.1s //tensorflow/compiler/tests:gather_test_cpu PASSED in 94.8s //tensorflow/compiler/tests:gather_test_cpu_mlir_bridge_test PASSED in 44.8s //tensorflow/compiler/tests:image_ops_jit_compile_test_cpu PASSED in 
15.3s //tensorflow/compiler/tests:jit_test_cpu PASSED in 56.1s //tensorflow/compiler/tests:listdiff_op_test_cpu PASSED in 11.7s //tensorflow/compiler/tests:listdiff_op_test_cpu_mlir_bridge_test PASSED in 42.4s //tensorflow/compiler/tests:lrn_ops_test_cpu PASSED in 9.7s //tensorflow/compiler/tests:lrn_ops_test_cpu_mlir_bridge_test PASSED in 12.1s //tensorflow/compiler/tests:lstm_test_cpu PASSED in 31.9s //tensorflow/compiler/tests:manip_ops_test_cpu PASSED in 14.3s //tensorflow/compiler/tests:manip_ops_test_cpu_mlir_bridge_test PASSED in 12.5s //tensorflow/compiler/tests:matrix_inverse_op_test_cpu PASSED in 21.1s //tensorflow/compiler/tests:matrix_inverse_op_test_cpu_mlir_bridge_test PASSED in 18.8s //tensorflow/compiler/tests:matrix_solve_op_test_cpu PASSED in 13.2s //tensorflow/compiler/tests:matrix_solve_op_test_cpu_mlir_bridge_test PASSED in 12.4s //tensorflow/compiler/tests:momentum_test_cpu PASSED in 12.0s //tensorflow/compiler/tests:nary_ops_test_cpu PASSED in 10.7s //tensorflow/compiler/tests:nary_ops_test_cpu_mlir_bridge_test PASSED in 38.1s //tensorflow/compiler/tests:nullary_ops_test_cpu PASSED in 10.4s //tensorflow/compiler/tests:nullary_ops_test_cpu_mlir_bridge_test PASSED in 12.8s //tensorflow/compiler/tests:placeholder_test_cpu PASSED in 10.3s //tensorflow/compiler/tests:placeholder_test_cpu_mlir_bridge_test PASSED in 9.4s //tensorflow/compiler/tests:proximal_adagrad_test_cpu PASSED in 11.1s //tensorflow/compiler/tests:proximal_gradient_descent_test_cpu PASSED in 10.8s //tensorflow/compiler/tests:quantized_ops_test_cpu PASSED in 10.1s //tensorflow/compiler/tests:reduce_window_test_cpu PASSED in 11.8s //tensorflow/compiler/tests:reduce_window_test_cpu_mlir_bridge_test PASSED in 10.0s //tensorflow/compiler/tests:repeat_op_test_cpu PASSED in 13.8s //tensorflow/compiler/tests:repeat_op_test_cpu_mlir_bridge_test PASSED in 12.8s //tensorflow/compiler/tests:reshape_op_test_cpu PASSED in 10.0s //tensorflow/compiler/tests:reshape_op_test_cpu_mlir_bridge_test PASSED in 22.7s //tensorflow/compiler/tests:reverse_ops_test_cpu PASSED in 14.2s //tensorflow/compiler/tests:reverse_ops_test_cpu_mlir_bridge_test PASSED in 12.1s //tensorflow/compiler/tests:reverse_sequence_op_test_cpu PASSED in 14.1s //tensorflow/compiler/tests:reverse_sequence_op_test_cpu_mlir_bridge_test PASSED in 10.1s //tensorflow/compiler/tests:rmsprop_test_cpu PASSED in 12.9s //tensorflow/compiler/tests:scatter_nd_op_test_cpu PASSED in 21.3s //tensorflow/compiler/tests:scatter_nd_op_test_cpu_mlir_bridge_test PASSED in 21.6s //tensorflow/compiler/tests:searchsorted_op_test_cpu PASSED in 12.4s //tensorflow/compiler/tests:searchsorted_op_test_cpu_mlir_bridge_test PASSED in 13.5s //tensorflow/compiler/tests:segment_reduction_ops_test_cpu PASSED in 35.7s //tensorflow/compiler/tests:segment_reduction_ops_test_cpu_mlir_bridge_test PASSED in 22.0s //tensorflow/compiler/tests:self_adjoint_eig_op_test_cpu PASSED in 34.1s //tensorflow/compiler/tests:self_adjoint_eig_op_test_cpu_mlir_bridge_test PASSED in 15.8s //tensorflow/compiler/tests:slice_ops_test_cpu PASSED in 16.4s //tensorflow/compiler/tests:slice_ops_test_cpu_mlir_bridge_test PASSED in 22.8s //tensorflow/compiler/tests:sparse_to_dense_op_test_cpu PASSED in 10.0s //tensorflow/compiler/tests:sparse_to_dense_op_test_cpu_mlir_bridge_test PASSED in 10.1s //tensorflow/compiler/tests:stack_ops_test_cpu PASSED in 12.5s //tensorflow/compiler/tests:tensor_float_32_test_cpu PASSED in 14.8s //tensorflow/compiler/tests:tensor_float_32_test_cpu_mlir_bridge_test PASSED in 13.2s 
//tensorflow/compiler/tests:tensor_list_ops_test_cpu PASSED in 23.3s
//tensorflow/compiler/tests:tridiagonal_matmul_ops_test_cpu PASSED in 22.5s
//tensorflow/compiler/tests:tridiagonal_matmul_ops_test_cpu_mlir_bridge_test PASSED in 15.5s
//tensorflow/compiler/tests:tridiagonal_solve_ops_test_cpu PASSED in 13.8s
//tensorflow/compiler/tests:tridiagonal_solve_ops_test_cpu_mlir_bridge_test PASSED in 14.3s
//tensorflow/compiler/tests:unique_ops_test_cpu PASSED in 10.8s
//tensorflow/compiler/tests:variable_ops_test_cpu PASSED in 23.1s
//tensorflow/compiler/tests:variable_ops_test_cpu_mlir_bridge_test PASSED in 14.8s
//tensorflow/compiler/tests:where_op_test_cpu PASSED in 10.4s
//tensorflow/compiler/tests:while_test_cpu PASSED in 11.0s
//tensorflow/compiler/tests:xla_call_module_no_platform_check_test_cpu PASSED in 16.2s
//tensorflow/compiler/tests:xla_call_module_no_shape_assertions_check_test_cpu PASSED in 12.8s
//tensorflow/compiler/tests:xla_call_module_test_cpu PASSED in 17.1s
//tensorflow/compiler/tests:xla_custom_call_ops_test_cpu PASSED in 9.8s
//tensorflow/compiler/tests:xla_device_gpu_test_cpu PASSED in 11.0s
//tensorflow/compiler/tests:xla_device_test_cpu PASSED in 21.0s
//tensorflow/compiler/tests:xla_device_test_cpu_mlir_bridge_test PASSED in 16.9s
//tensorflow/compiler/tests:xla_dump_to_test_cpu PASSED in 9.4s
//tensorflow/compiler/tests:xla_dump_to_test_cpu_mlir_bridge_test PASSED in 9.6s
//tensorflow/compiler/tests:xla_ops_test_cpu PASSED in 31.4s
//tensorflow/compiler/tests:xla_ops_test_cpu_mlir_bridge_test PASSED in 31.2s
//tensorflow/compiler/tests:xla_test_test PASSED in 8.9s
//tensorflow/compiler/tf2xla:const_analysis_test PASSED in 4.2s
//tensorflow/compiler/tf2xla:cpu_function_runtime_test PASSED in 0.1s
//tensorflow/compiler/tf2xla:functionalize_cond_test PASSED in 0.7s
//tensorflow/compiler/tf2xla:functionalize_control_flow_test PASSED in 0.9s
//tensorflow/compiler/tf2xla:fused_batchnorm_reserve_space_test_cpu PASSED in 18.0s
//tensorflow/compiler/tf2xla:graph_compiler_test PASSED in 3.8s
//tensorflow/compiler/tf2xla:literal_util_test PASSED in 0.5s
//tensorflow/compiler/tf2xla:resource_operation_table_test PASSED in 4.2s
//tensorflow/compiler/tf2xla:resource_util_test_cpu PASSED in 1.8s
//tensorflow/compiler/tf2xla:sharding_util_test PASSED in 0.8s
//tensorflow/compiler/tf2xla:tf2xla_opset_test PASSED in 6.9s
//tensorflow/compiler/tf2xla:tf2xla_test PASSED in 12.7s
//tensorflow/compiler/tf2xla:tf2xla_util_test PASSED in 0.8s
//tensorflow/compiler/tf2xla:type_util_test PASSED in 0.7s
//tensorflow/compiler/tf2xla:xla_compiler_test PASSED in 15.8s
//tensorflow/compiler/tf2xla:xla_jit_compiled_cpu_function_test PASSED in 12.4s
//tensorflow/compiler/tf2xla:xla_op_registry_test PASSED in 3.8s
//tensorflow/compiler/tf2xla/kernels:rng_converter_utils_test PASSED in 1.2s
//tensorflow/core:@local_tsl__tsl_lib_core_legacy_lib_core_all_tests PASSED in 0.8s
//tensorflow/core:__tensorflow_core_lib_core_legacy_lib_core_all_tests PASSED in 6.4s
//tensorflow/core:__tensorflow_core_lib_gtl_legacy_lib_gtl_tests PASSED in 0.7s
//tensorflow/core:__tensorflow_core_lib_monitoring_cell_reader_test PASSED in 28.9s
//tensorflow/core:__tensorflow_core_lib_monitoring_collection_registry_test PASSED in 0.1s
//tensorflow/core:__tensorflow_core_lib_monitoring_counter_test PASSED in 0.1s
//tensorflow/core:__tensorflow_core_lib_monitoring_gauge_test PASSED in 0.2s
//tensorflow/core:__tensorflow_core_lib_monitoring_metric_def_test PASSED in 0.1s
//tensorflow/core:__tensorflow_core_lib_monitoring_percentile_sampler_test PASSED in 0.1s
//tensorflow/core:__tensorflow_core_lib_monitoring_sampler_test PASSED in 1.1s
//tensorflow/core:__tensorflow_core_lib_monitoring_test_utils_test PASSED in 0.2s
//tensorflow/core:__tensorflow_core_lib_strings_legacy_low_level_library_tests PASSED in 0.7s
//tensorflow/core:__tensorflow_core_lib_wav_wav_io_test PASSED in 0.1s
//tensorflow/core:__tensorflow_core_util_mkl_util_test_srcs PASSED in 0.1s
//tensorflow/core:lib_strings_ordered_code_test PASSED in 1.8s
//tensorflow/core:lib_strings_proto_serialization_test PASSED in 0.8s
//tensorflow/core/api_def:api_test PASSED in 6.3s
//tensorflow/core/api_def:update_api_def_test PASSED in 0.2s
//tensorflow/core/common_runtime:all_to_all_test_cpu PASSED in 0.4s
//tensorflow/core/common_runtime:arg_ret_placement_test PASSED in 0.8s
//tensorflow/core/common_runtime:buf_rendezvous_test PASSED in 0.7s
//tensorflow/core/common_runtime:collective_executor_mgr_test PASSED in 0.7s
//tensorflow/core/common_runtime:collective_param_resolver_local_test PASSED in 4.9s
//tensorflow/core/common_runtime:collective_rma_local_test PASSED in 0.7s
//tensorflow/core/common_runtime:colocate_predecessor_trees_pass_test PASSED in 0.7s
//tensorflow/core/common_runtime:composite_device_test PASSED in 0.3s
//tensorflow/core/common_runtime:cost_measurement_registry_test PASSED in 1.9s
//tensorflow/core/common_runtime:cost_util_test PASSED in 0.1s
//tensorflow/core/common_runtime:device_mgr_test PASSED in 0.6s
//tensorflow/core/common_runtime:device_propagation_test PASSED in 0.8s
//tensorflow/core/common_runtime:device_resolver_local_test PASSED in 0.6s
//tensorflow/core/common_runtime:device_set_test PASSED in 0.7s
//tensorflow/core/common_runtime:direct_session_test_cpu PASSED in 1.5s
//tensorflow/core/common_runtime:direct_session_with_debug_test PASSED in 1.8s
//tensorflow/core/common_runtime:direct_session_with_tracking_alloc_test PASSED in 0.9s
//tensorflow/core/common_runtime:dynamic_device_mgr_test PASSED in 0.7s
//tensorflow/core/common_runtime:eval_const_tensor_test PASSED in 0.6s
//tensorflow/core/common_runtime:executor_test PASSED in 1.3s
//tensorflow/core/common_runtime:function_optimization_registration_test PASSED in 0.7s
//tensorflow/core/common_runtime:function_optimization_registry_no_pass_test PASSED in 0.7s
//tensorflow/core/common_runtime:function_optimization_registry_pass_failure_test PASSED in 0.6s
//tensorflow/core/common_runtime:function_optimization_registry_test PASSED in 0.6s
//tensorflow/core/common_runtime:function_threadpool_test PASSED in 0.8s
//tensorflow/core/common_runtime:graph_constructor_test PASSED in 1.9s
//tensorflow/core/common_runtime:graph_runner_test PASSED in 0.6s
//tensorflow/core/common_runtime:hierarchical_tree_broadcaster_test_cpu PASSED in 2.7s
//tensorflow/core/common_runtime:inline_function_utils_test PASSED in 0.4s
//tensorflow/core/common_runtime:input_colocation_exemption_registry_test PASSED in 0.4s
//tensorflow/core/common_runtime:int32_fulltype_test PASSED in 1.0s
//tensorflow/core/common_runtime:isolate_placer_inspection_required_ops_pass_test PASSED in 0.7s
//tensorflow/core/common_runtime:lower_case_op_test PASSED in 2.0s
//tensorflow/core/common_runtime:lower_function_call_test PASSED in 1.9s
//tensorflow/core/common_runtime:lower_functional_ops_test PASSED in 2.0s
//tensorflow/core/common_runtime:lower_if_op_test PASSED in 1.9s
//tensorflow/core/common_runtime:lower_while_op_test PASSED in 2.1s
//tensorflow/core/common_runtime:mkl_cpu_allocator_test PASSED in 0.1s
//tensorflow/core/common_runtime:mkl_threadpool_device_test PASSED in 0.1s
//tensorflow/core/common_runtime:no_op_cost_measurement_test PASSED in 0.1s
//tensorflow/core/common_runtime:null_request_cost_accessor_test PASSED in 0.1s
//tensorflow/core/common_runtime:optimization_registry_test PASSED in 0.7s
//tensorflow/core/common_runtime:optimize_cross_host_control_deps_test PASSED in 6.0s
//tensorflow/core/common_runtime:optimize_function_graph_utils_test PASSED in 0.4s
//tensorflow/core/common_runtime:partitioning_utils_test PASSED in 0.4s
//tensorflow/core/common_runtime:pending_counts_test PASSED in 0.6s
//tensorflow/core/common_runtime:permuter_test_cpu PASSED in 3.0s
//tensorflow/core/common_runtime:placer_inspection_required_ops_utils_test PASSED in 0.7s
//tensorflow/core/common_runtime:placer_test PASSED in 0.7s
//tensorflow/core/common_runtime:process_function_library_runtime_test_cpu PASSED in 0.5s
//tensorflow/core/common_runtime:process_util_test PASSED in 0.1s
//tensorflow/core/common_runtime:quantize_training_test PASSED in 2.0s
//tensorflow/core/common_runtime:rendezvous_util_test PASSED in 0.2s
//tensorflow/core/common_runtime:replicate_constants_pass_test PASSED in 0.7s
//tensorflow/core/common_runtime:replicate_per_replica_nodes_test PASSED in 0.9s
//tensorflow/core/common_runtime:request_cost_accessor_registry_test PASSED in 1.9s
//tensorflow/core/common_runtime:request_cost_test PASSED in 0.1s
//tensorflow/core/common_runtime:ring_gatherer_test_cpu PASSED in 2.0s
//tensorflow/core/common_runtime:ring_reducer_test_cpu PASSED in 4.6s
//tensorflow/core/common_runtime:scoped_allocator_mgr_test PASSED in 4.4s
//tensorflow/core/common_runtime:session_test PASSED in 0.6s
//tensorflow/core/common_runtime:shape_refiner_test PASSED in 0.6s
//tensorflow/core/common_runtime:single_threaded_executor_test PASSED in 0.6s
//tensorflow/core/common_runtime:threadpool_device_test PASSED in 0.6s
//tensorflow/core/common_runtime:type_inference_test PASSED in 2.1s
//tensorflow/core/common_runtime/eager:attr_builder_test PASSED in 21.0s
//tensorflow/core/common_runtime/eager:context_test PASSED in 11.5s
//tensorflow/core/common_runtime/eager:custom_device_test PASSED in 8.9s
//tensorflow/core/common_runtime/eager:eager_executor_test PASSED in 10.2s
//tensorflow/core/common_runtime/eager:eager_op_rewrite_registry_test PASSED in 1.1s
//tensorflow/core/common_runtime/eager:eager_operation_test PASSED in 8.6s
//tensorflow/core/common_runtime/eager:execute_node_test PASSED in 9.3s
//tensorflow/core/common_runtime/eager:execute_test PASSED in 20.0s
//tensorflow/core/common_runtime/eager:kernel_and_device_test PASSED in 1.1s
//tensorflow/core/common_runtime/eager:mkl_eager_op_rewrite_test PASSED in 11.0s
//tensorflow/core/common_runtime/eager:placement_test PASSED in 9.4s
//tensorflow/core/common_runtime/eager:placement_utils_test PASSED in 9.1s
//tensorflow/core/common_runtime/eager:summary_optimizer_test PASSED in 0.1s
//tensorflow/core/common_runtime/eager:tensor_handle_data_test PASSED in 8.3s
//tensorflow/core/common_runtime/eager:tensor_handle_test PASSED in 8.3s
//tensorflow/core/common_runtime/gpu:gpu_device_on_non_gpu_machine_test PASSED in 0.1s
//tensorflow/core/common_runtime/gpu:gpu_serving_device_selector_test PASSED in 0.8s
//tensorflow/core/common_runtime/next_pluggable_device:c_plugin_coordination_service_agent_test PASSED in 3.5s
//tensorflow/core/common_runtime/next_pluggable_device/c:plugin_c_api_test PASSED in 23.3s
//tensorflow/core/common_runtime/next_pluggable_device/c:tf_rendezvous_c_api_test PASSED in 0.3s
//tensorflow/core/config:flags_py_test PASSED in 7.9s
//tensorflow/core/config:flags_test PASSED in 0.4s
//tensorflow/core/data:compression_utils_test PASSED in 1.9s
//tensorflow/core/data:dataset_utils_test PASSED in 0.5s
//tensorflow/core/data:hash_utils_test PASSED in 2.3s
//tensorflow/core/data:metric_utils_test PASSED in 6.1s
//tensorflow/core/data:name_utils_test PASSED in 0.1s
//tensorflow/core/data:rewrite_utils_test PASSED in 0.9s
//tensorflow/core/data:serialization_utils_test PASSED in 0.5s
//tensorflow/core/data:snapshot_utils_test PASSED in 0.5s
//tensorflow/core/data:split_utils_test PASSED in 0.4s
//tensorflow/core/data:standalone_save_restore_test PASSED in 1.5s
//tensorflow/core/data:standalone_test PASSED in 4.3s
//tensorflow/core/data:tfdataz_metrics_test PASSED in 1.7s
//tensorflow/core/data:unbounded_thread_pool_test PASSED in 0.7s
//tensorflow/core/data:utils_test PASSED in 0.1s
//tensorflow/core/data/service:auto_scaler_test PASSED in 0.6s
//tensorflow/core/data/service:byte_size_test PASSED in 0.2s
//tensorflow/core/data/service:common_test PASSED in 0.1s
//tensorflow/core/data/service:credentials_factory_test PASSED in 0.5s
//tensorflow/core/data/service:cross_trainer_cache_test PASSED in 1.9s
//tensorflow/core/data/service:data_service_test PASSED in 9.6s
//tensorflow/core/data/service:data_transfer_test PASSED in 0.4s
//tensorflow/core/data/service:dataset_store_test PASSED in 0.6s
//tensorflow/core/data/service:dispatcher_client_test PASSED in 2.8s
//tensorflow/core/data/service:dispatcher_state_test PASSED in 0.6s
//tensorflow/core/data/service:graph_rewriters_test PASSED in 0.5s
//tensorflow/core/data/service:grpc_dispatcher_impl_test PASSED in 2.2s
//tensorflow/core/data/service:grpc_util_test PASSED in 0.5s
//tensorflow/core/data/service:grpc_worker_impl_test PASSED in 2.2s
//tensorflow/core/data/service:journal_test PASSED in 0.4s
//tensorflow/core/data/service:split_provider_test PASSED in 1.8s
//tensorflow/core/data/service:task_runner_test PASSED in 2.8s
//tensorflow/core/data/service:test_util_test PASSED in 1.8s
//tensorflow/core/data/service:url_test PASSED in 0.1s
//tensorflow/core/data/service:utils_test PASSED in 0.5s
//tensorflow/core/data/service:validate_utils_test PASSED in 0.8s
//tensorflow/core/data/service:worker_client_test PASSED in 2.4s
//tensorflow/core/data/service:worker_impl_test PASSED in 2.1s
//tensorflow/core/data/service/client:data_service_client_test PASSED in 2.9s
//tensorflow/core/data/service/client:utils_test PASSED in 2.1s
//tensorflow/core/data/service/client:validate_utils_test PASSED in 1.6s
//tensorflow/core/data/service/snapshot:distributed_snapshot_test PASSED in 17.0s
//tensorflow/core/data/service/snapshot:file_utils_test PASSED in 0.4s
//tensorflow/core/data/service/snapshot:parallel_tfrecord_writer_test PASSED in 30.1s
//tensorflow/core/data/service/snapshot:path_utils_test PASSED in 0.6s
//tensorflow/core/data/service/snapshot:prefetched_split_provider_test PASSED in 19.2s
//tensorflow/core/data/service/snapshot:snapshot_chunk_provider_test PASSED in 1.8s
//tensorflow/core/data/service/snapshot:snapshot_manager_test PASSED in 1.9s
//tensorflow/core/data/service/snapshot:snapshot_split_provider_test PASSED in 0.6s
//tensorflow/core/data/service/snapshot:snapshot_stream_writer_checkpoint_test PASSED in 2.5s
//tensorflow/core/data/service/snapshot:snapshot_stream_writer_test PASSED in 2.2s
//tensorflow/core/data/service/snapshot:utils_test PASSED in 0.5s //tensorflow/core/debug:debug_graph_utils_test PASSED in 0.4s //tensorflow/core/distributed_runtime:call_options_test PASSED in 0.1s //tensorflow/core/distributed_runtime:cluster_function_library_runtime_test PASSED in 3.7s //tensorflow/core/distributed_runtime:collective_param_resolver_distributed_test PASSED in 0.6s //tensorflow/core/distributed_runtime:collective_rma_distributed_test PASSED in 0.4s //tensorflow/core/distributed_runtime:device_resolver_distributed_test PASSED in 0.4s //tensorflow/core/distributed_runtime:message_wrappers_test PASSED in 0.2s //tensorflow/core/distributed_runtime:partial_run_mgr_test PASSED in 0.9s //tensorflow/core/distributed_runtime:recent_request_ids_test PASSED in 0.1s //tensorflow/core/distributed_runtime:request_id_test PASSED in 0.8s //tensorflow/core/distributed_runtime:rpc_collective_executor_mgr_test PASSED in 0.3s //tensorflow/core/distributed_runtime:server_lib_test PASSED in 0.2s //tensorflow/core/distributed_runtime:session_mgr_test PASSED in 0.6s //tensorflow/core/distributed_runtime:tensor_coding_test PASSED in 0.7s //tensorflow/core/distributed_runtime/coordination:coordination_service_barrier_proxy_test PASSED in 3.0s //tensorflow/core/distributed_runtime/eager:eager_service_impl_test PASSED in 17.6s //tensorflow/core/distributed_runtime/eager:remote_mgr_test PASSED in 8.1s //tensorflow/core/distributed_runtime/integration_test:c_api_multi_client_test_cpu PASSED in 23.3s //tensorflow/core/distributed_runtime/integration_test:c_api_recoverable_jobs_test_cpu PASSED in 33.0s //tensorflow/core/distributed_runtime/integration_test:c_api_session_coordination_test_cpu PASSED in 22.7s //tensorflow/core/distributed_runtime/rpc:grpc_tensor_coding_test PASSED in 2.5s //tensorflow/core/distributed_runtime/rpc:grpc_worker_cache_test PASSED in 0.7s //tensorflow/core/distributed_runtime/rpc/eager:grpc_eager_client_test PASSED in 0.8s //tensorflow/core/example:example_parser_configuration_test PASSED in 0.8s //tensorflow/core/example:feature_util_test PASSED in 0.1s //tensorflow/core/framework:allocator_test PASSED in 3.5s //tensorflow/core/framework:attr_value_util_test PASSED in 0.7s //tensorflow/core/framework:batch_util_test PASSED in 0.7s //tensorflow/core/framework:bfloat16_test PASSED in 0.7s //tensorflow/core/framework:common_shape_fns_test PASSED in 0.7s //tensorflow/core/framework:dataset_test PASSED in 0.7s //tensorflow/core/framework:device_base_test PASSED in 0.7s //tensorflow/core/framework:disable_jit_test PASSED in 0.7s //tensorflow/core/framework:framework_op_gen_lib_test PASSED in 0.4s //tensorflow/core/framework:framework_op_segment_test PASSED in 0.6s //tensorflow/core/framework:framework_resource_var_test PASSED in 0.1s //tensorflow/core/framework:framework_run_handler_test PASSED in 1.5s //tensorflow/core/framework:framework_run_handler_util_test PASSED in 2.6s //tensorflow/core/framework:full_type_inference_util_test PASSED in 0.7s //tensorflow/core/framework:full_type_util_test PASSED in 0.7s //tensorflow/core/framework:function_test PASSED in 0.7s //tensorflow/core/framework:graph_def_util_test PASSED in 0.7s //tensorflow/core/framework:graph_to_functiondef_test PASSED in 0.7s //tensorflow/core/framework:kernel_def_builder_test PASSED in 0.7s //tensorflow/core/framework:kernel_def_util_test PASSED in 0.7s //tensorflow/core/framework:memory_types_test PASSED in 0.7s //tensorflow/core/framework:model_test PASSED in 0.7s 
//tensorflow/core/framework:node_def_builder_test PASSED in 0.7s //tensorflow/core/framework:node_def_util_test PASSED in 0.7s //tensorflow/core/framework:node_properties_test PASSED in 0.7s //tensorflow/core/framework:op_compatibility_test PASSED in 0.7s //tensorflow/core/framework:op_def_builder_test PASSED in 0.7s //tensorflow/core/framework:op_def_util_test PASSED in 0.7s //tensorflow/core/framework:op_kernel_test PASSED in 0.7s //tensorflow/core/framework:op_registration_test PASSED in 0.7s //tensorflow/core/framework:partial_tensor_shape_test PASSED in 0.7s //tensorflow/core/framework:rendezvous_test PASSED in 2.8s //tensorflow/core/framework:resource_handle_test PASSED in 0.8s //tensorflow/core/framework:resource_mgr_test PASSED in 1.7s //tensorflow/core/framework:resource_op_kernel_test PASSED in 0.8s //tensorflow/core/framework:shape_inference_test PASSED in 0.7s //tensorflow/core/framework:shape_inference_testutil_test PASSED in 0.7s //tensorflow/core/framework:tensor_matcher_test PASSED in 0.7s //tensorflow/core/framework:tensor_shape_test PASSED in 6.5s //tensorflow/core/framework:tensor_slice_test PASSED in 0.8s //tensorflow/core/framework:tensor_test PASSED in 38.4s //tensorflow/core/framework:tensor_testutil_test PASSED in 0.7s //tensorflow/core/framework:tensor_util_test PASSED in 0.8s //tensorflow/core/framework:tracking_allocator_test PASSED in 0.7s //tensorflow/core/framework:types_test PASSED in 0.8s //tensorflow/core/framework:variant_op_registry_test PASSED in 23.7s //tensorflow/core/framework:variant_test PASSED in 0.7s //tensorflow/core/framework/registration:registration_test PASSED in 0.4s //tensorflow/core/function/capture:by_ref_capture_test PASSED in 9.1s //tensorflow/core/function/capture:capture_container_test PASSED in 8.5s //tensorflow/core/function/integration_test:side_inputs_manual_api_test PASSED in 22.7s //tensorflow/core/function/integration_test:side_inputs_test PASSED in 23.4s //tensorflow/core/function/polymorphism:function_cache_test PASSED in 8.4s //tensorflow/core/function/polymorphism:function_type_test PASSED in 8.5s //tensorflow/core/function/polymorphism:type_dispatch_test PASSED in 9.0s //tensorflow/core/function/runtime_client:runtime_client_cc_test PASSED in 32.0s //tensorflow/core/function/trace_type:custom_nest_trace_type_test PASSED in 8.6s //tensorflow/core/function/trace_type:default_types_test PASSED in 8.5s //tensorflow/core/function/trace_type:serialization_test PASSED in 8.5s //tensorflow/core/function/trace_type:trace_type_test PASSED in 11.4s //tensorflow/core/graph:algorithm_test PASSED in 0.8s //tensorflow/core/graph:collective_order_test PASSED in 1.0s //tensorflow/core/graph:control_flow_test PASSED in 0.7s //tensorflow/core/graph:costmodel_test PASSED in 0.7s //tensorflow/core/graph:edgeset_test PASSED in 0.7s //tensorflow/core/graph:graph_debug_info_builder_test PASSED in 0.7s //tensorflow/core/graph:graph_def_builder_test PASSED in 0.8s //tensorflow/core/graph:graph_partition_test PASSED in 0.8s //tensorflow/core/graph:graph_test PASSED in 0.7s //tensorflow/core/graph:node_builder_test PASSED in 0.8s //tensorflow/core/graph:optimizer_cse_test PASSED in 0.8s //tensorflow/core/graph:subgraph_test PASSED in 0.8s //tensorflow/core/graph:tensor_id_test PASSED in 0.7s //tensorflow/core/graph:validate_test PASSED in 0.7s //tensorflow/core/graph/regularization:simple_delete_test PASSED in 1.0s //tensorflow/core/graph/regularization:util_test PASSED in 0.2s //tensorflow/core/grappler:graph_topology_view_test PASSED in 0.1s 
//tensorflow/core/grappler:graph_view_test PASSED in 1.3s //tensorflow/core/grappler:grappler_item_builder_test PASSED in 1.4s //tensorflow/core/grappler:grappler_item_test PASSED in 1.3s //tensorflow/core/grappler:mutable_graph_view_test PASSED in 1.3s //tensorflow/core/grappler:utils_test PASSED in 2.3s //tensorflow/core/grappler/clusters:single_machine_test PASSED in 24.7s //tensorflow/core/grappler/clusters:virtual_cluster_test PASSED in 1.6s //tensorflow/core/grappler/costs:analytical_cost_estimator_test PASSED in 1.6s //tensorflow/core/grappler/costs:cost_estimator_test PASSED in 0.7s //tensorflow/core/grappler/costs:graph_memory_test PASSED in 1.3s //tensorflow/core/grappler/costs:graph_properties_test PASSED in 2.7s //tensorflow/core/grappler/costs:robust_stats_test PASSED in 0.2s //tensorflow/core/grappler/costs:utils_test PASSED in 1.3s //tensorflow/core/grappler/costs:virtual_placer_test PASSED in 0.7s //tensorflow/core/grappler/costs:virtual_scheduler_test PASSED in 1.7s //tensorflow/core/grappler/graph_analyzer:gen_node_test PASSED in 2.8s //tensorflow/core/grappler/graph_analyzer:graph_analyzer_test PASSED in 2.0s //tensorflow/core/grappler/graph_analyzer:hash_tools_test PASSED in 2.7s //tensorflow/core/grappler/graph_analyzer:sig_node_test PASSED in 5.8s //tensorflow/core/grappler/graph_analyzer:subgraph_test PASSED in 1.7s //tensorflow/core/grappler/inputs:utils_test PASSED in 0.1s //tensorflow/core/grappler/optimizers:arithmetic_optimizer_test_cpu PASSED in 3.3s //tensorflow/core/grappler/optimizers:auto_mixed_precision_test_cpu PASSED in 2.2s //tensorflow/core/grappler/optimizers:auto_parallel_test_cpu PASSED in 2.7s //tensorflow/core/grappler/optimizers:common_subgraph_elimination_test_cpu PASSED in 1.9s //tensorflow/core/grappler/optimizers:custom_graph_optimizer_registry_test_cpu PASSED in 3.8s //tensorflow/core/grappler/optimizers:debug_stripper_test_cpu PASSED in 1.8s //tensorflow/core/grappler/optimizers:dependency_optimizer_test_cpu PASSED in 1.7s //tensorflow/core/grappler/optimizers:evaluation_utils_test PASSED in 0.3s //tensorflow/core/grappler/optimizers:function_api_info_test PASSED in 0.7s //tensorflow/core/grappler/optimizers:function_optimizer_test_cpu PASSED in 2.4s //tensorflow/core/grappler/optimizers:generic_layout_optimizer_test_cpu PASSED in 1.8s //tensorflow/core/grappler/optimizers:generic_layout_optimizer_transposer_factory_test PASSED in 0.2s //tensorflow/core/grappler/optimizers:generic_layout_optimizer_transposer_test_cpu PASSED in 2.5s //tensorflow/core/grappler/optimizers:graph_optimizer_stage_test_cpu PASSED in 1.7s //tensorflow/core/grappler/optimizers:implementation_selector_test PASSED in 1.8s //tensorflow/core/grappler/optimizers:loop_optimizer_test_cpu PASSED in 1.9s //tensorflow/core/grappler/optimizers:memory_optimizer_test_cpu PASSED in 2.1s //tensorflow/core/grappler/optimizers:meta_optimizer_test_cpu PASSED in 7.0s //tensorflow/core/grappler/optimizers:mkl_remapper_test PASSED in 2.3s //tensorflow/core/grappler/optimizers:model_pruner_test_cpu PASSED in 1.9s //tensorflow/core/grappler/optimizers:pin_to_host_optimizer_test_cpu PASSED in 2.2s //tensorflow/core/grappler/optimizers:remapper_test_cpu PASSED in 7.7s //tensorflow/core/grappler/optimizers:scoped_allocator_optimizer_test PASSED in 2.2s //tensorflow/core/grappler/optimizers:shape_optimizer_test_cpu PASSED in 1.8s //tensorflow/core/grappler/optimizers:static_schedule_test_cpu PASSED in 1.5s //tensorflow/core/grappler/optimizers:tfg_optimizer_hook_test PASSED in 0.5s 
//tensorflow/core/grappler/optimizers/data:auto_shard_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:autotune_buffer_sizes_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:batch_parallelization_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:disable_intra_op_parallelism_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:disable_prefetch_legacy_autotune_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:enable_gradient_descent_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:filter_fusion_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:filter_parallelization_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:function_utils_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:fusion_utils_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:graph_utils_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:inject_io_prefetch_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:inject_prefetch_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:make_deterministic_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:make_sloppy_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:map_and_batch_fusion_test PASSED in 1.1s //tensorflow/core/grappler/optimizers/data:map_and_filter_fusion_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:map_fusion_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:map_parallelization_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:noop_elimination_test PASSED in 0.9s //tensorflow/core/grappler/optimizers/data:parallel_batch_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:remove_compression_map_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:replicate_on_split_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/data:seq_interleave_prefetch_test PASSED in 0.5s //tensorflow/core/grappler/optimizers/data:shuffle_and_repeat_fusion_test PASSED in 0.9s //tensorflow/core/grappler/optimizers/data:slack_test PASSED in 0.6s //tensorflow/core/grappler/optimizers/data:split_utils_test PASSED in 1.4s //tensorflow/core/grappler/optimizers/data:use_private_thread_pool_test PASSED in 0.4s //tensorflow/core/grappler/optimizers/inference:batch_op_rewriter_test PASSED in 0.1s //tensorflow/core/grappler/utils:canonicalizer_test PASSED in 1.2s //tensorflow/core/grappler/utils:colocation_test PASSED in 0.4s //tensorflow/core/grappler/utils:frame_test PASSED in 0.5s //tensorflow/core/grappler/utils:functions_test PASSED in 1.4s //tensorflow/core/grappler/utils:graph_view_internal_test PASSED in 0.4s //tensorflow/core/grappler/utils:graph_view_test PASSED in 1.8s //tensorflow/core/grappler/utils:grappler_test_test PASSED in 7.3s //tensorflow/core/grappler/utils:pattern_utils_test PASSED in 0.4s //tensorflow/core/grappler/utils:scc_test PASSED in 1.3s //tensorflow/core/grappler/utils:symbolic_shapes_test PASSED in 0.3s //tensorflow/core/grappler/utils:topological_sort_test PASSED in 0.4s //tensorflow/core/grappler/utils:tpu_test PASSED in 0.6s //tensorflow/core/grappler/utils:transitive_fanin_test PASSED in 0.5s //tensorflow/core/grappler/utils:traversal_test PASSED in 0.4s //tensorflow/core/grappler/verifiers:structure_verifier_test PASSED in 1.3s //tensorflow/core/ir:interfaces_test PASSED in 0.8s //tensorflow/core/ir:ops_test PASSED in 0.1s //tensorflow/core/ir:shape_inference_utils_test PASSED in 0.6s 
//tensorflow/core/ir:tf_op_registry_test PASSED in 0.8s //tensorflow/core/ir:tf_op_wrapper_test PASSED in 0.2s //tensorflow/core/ir:utility_test PASSED in 0.1s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:arg_as_control_ret.pbtxt.test PASSED in 1.9s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:backedge_segment.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:empty.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:error_during_backedge.pbtxt.test PASSED in 0.6s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:import_case_with_attr_inference.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:import_if_with_attr_inference.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:import_iterator_get_next_attr_inference.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:import_underscore_output_shapes.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:import_while_with_attr_inference.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:infeed_dequeue.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:infer_arg_handle_type.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:infer_with_output_shapes.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_arg_name.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_backedge_input_size.pbtxt.test PASSED in 0.6s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_duplicated_node_name.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_edge_index.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_edge_name.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_empty_attr_key.pbtxt.test PASSED in 0.6s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_empty_func_attr_key.pbtxt.test PASSED in 0.6s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_empty_func_attr_name.pbtxt.test PASSED in 0.6s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_empty_op_type.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_func_with_empty_name.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_function_import.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_generic_func_with_empty_control_result.pbtxt.test PASSED in 2.1s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_generic_func_with_empty_input.pbtxt.test PASSED in 1.8s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_generic_func_with_empty_name.pbtxt.test PASSED in 2.1s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_generic_func_with_empty_result.pbtxt.test PASSED in 2.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_generic_function_attr_name.pbtxt.test PASSED in 2.2s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_generic_function_named_edge_index.pbtxt.test PASSED in 1.9s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_handle_data.pbtxt.test PASSED in 2.9s 
//tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_control_input.pbtxt.test PASSED in 2.1s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_control_result.pbtxt.test PASSED in 2.3s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_control_result_value.pbtxt.test PASSED in 2.4s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_data_result.pbtxt.test PASSED in 1.1s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_data_result_value.pbtxt.test PASSED in 2.1s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_input.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_missing_two_inputs.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_named_edge_index.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_op_name.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:invalid_type_list.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:legacy_call.pbtxt.test PASSED in 0.6s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:negative_shape.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:negative_zero_constant.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:three_nodes_with_attrs.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/graphdef_to_mlir:version.pbtxt.test PASSED in 0.5s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:empty.mlir.test PASSED in 1.8s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:fulltype.mlir.test PASSED in 1.5s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:func_with_no_args_or_results.mlir.test PASSED in 1.7s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:negative_zero_constant.mlir.test PASSED in 1.7s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:nested_legacy_call.mlir.test PASSED in 2.3s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:three_nodes_with_attrs.mlir.test PASSED in 1.9s //tensorflow/core/ir/importexport/tests/mlir_to_graphdef:version.mlir.test PASSED in 1.8s //tensorflow/core/ir/importexport/tests/saved_model:saved_model_roundtrip_test PASSED in 0.9s //tensorflow/core/ir/tests:attributes.mlir.test PASSED in 1.8s //tensorflow/core/ir/tests:canonicalize.mlir.test PASSED in 1.8s //tensorflow/core/ir/tests:compatible_types.mlir.test PASSED in 1.2s //tensorflow/core/ir/tests:concrete-ops.mlir.test PASSED in 1.5s //tensorflow/core/ir/tests:generic_concrete_ops.mlir.test PASSED in 1.6s //tensorflow/core/ir/tests:invalid-concrete-ops.mlir.test PASSED in 1.9s //tensorflow/core/ir/tests:invalid-preserved-attrs.mlir.test PASSED in 1.9s //tensorflow/core/ir/tests:invalid.mlir.test PASSED in 1.4s //tensorflow/core/ir/tests:invalid_types.mlir.test PASSED in 1.2s //tensorflow/core/ir/tests:ops.mlir.test PASSED in 1.5s //tensorflow/core/ir/tests:region-invalid-ops.mlir.test PASSED in 1.5s //tensorflow/core/ir/tests:region-ops-graph.mlir.test PASSED in 1.8s //tensorflow/core/ir/tests:region-ops.mlir.test PASSED in 1.0s //tensorflow/core/ir/tests:types.mlir.test PASSED in 1.8s //tensorflow/core/ir/types:dialect_test PASSED in 0.1s //tensorflow/core/kernels:as_string_op_test PASSED in 0.4s //tensorflow/core/kernels:basic_ops_benchmark_test PASSED in 0.4s //tensorflow/core/kernels:batch_kernels_auto_warmup_test PASSED in 
0.8s //tensorflow/core/kernels:batch_kernels_env_test PASSED in 0.4s //tensorflow/core/kernels:batch_kernels_test PASSED in 34.5s //tensorflow/core/kernels:bias_op_test PASSED in 0.4s //tensorflow/core/kernels:bincount_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:broadcast_to_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:cast_op_test_cpu PASSED in 0.6s //tensorflow/core/kernels:checkpoint_callback_manager_test PASSED in 0.4s //tensorflow/core/kernels:clustering_ops_test PASSED in 0.4s //tensorflow/core/kernels:composite_tensor_variant_test PASSED in 0.4s //tensorflow/core/kernels:concat_op_test PASSED in 0.4s //tensorflow/core/kernels:constant_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:control_flow_ops_test PASSED in 6.0s //tensorflow/core/kernels:conv_grad_filter_ops_benchmark_test_cpu PASSED in 0.4s //tensorflow/core/kernels:conv_grad_input_ops_benchmark_test_cpu PASSED in 0.4s //tensorflow/core/kernels:conv_ops_benchmark_test_cpu PASSED in 0.5s //tensorflow/core/kernels:conv_ops_test_cpu PASSED in 4.7s //tensorflow/core/kernels:count_ops_test PASSED in 0.4s //tensorflow/core/kernels:cross_op_test PASSED in 0.4s //tensorflow/core/kernels:cwise_ops_test_cpu PASSED in 0.4s //tensorflow/core/kernels:debug_ops_test PASSED in 0.6s //tensorflow/core/kernels:decode_wav_op_test PASSED in 2.0s //tensorflow/core/kernels:deep_conv2d_test PASSED in 0.6s //tensorflow/core/kernels:dequantize_op_test PASSED in 0.5s //tensorflow/core/kernels:diag_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:dynamic_partition_op_test_cpu PASSED in 0.5s //tensorflow/core/kernels:dynamic_stitch_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:eigen_activations_test PASSED in 0.4s //tensorflow/core/kernels:eigen_attention_test PASSED in 0.1s //tensorflow/core/kernels:eigen_backward_cuboid_convolutions_test PASSED in 1.6s //tensorflow/core/kernels:eigen_backward_spatial_convolutions_test PASSED in 0.6s //tensorflow/core/kernels:eigen_benchmark_cpu_test PASSED in 0.1s //tensorflow/core/kernels:eigen_mkldnn_contraction_kernel_test PASSED in 0.1s //tensorflow/core/kernels:eigen_pooling_test PASSED in 0.9s //tensorflow/core/kernels:encode_wav_op_test PASSED in 1.9s //tensorflow/core/kernels:fingerprint_op_test PASSED in 0.4s //tensorflow/core/kernels:fused_batch_norm_ex_op_test_cpu PASSED in 0.6s //tensorflow/core/kernels:fused_batch_norm_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:gather_nd_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:gather_op_test_cpu PASSED in 0.5s //tensorflow/core/kernels:guarantee_const_op_test PASSED in 0.6s //tensorflow/core/kernels:identity_n_op_test PASSED in 0.4s //tensorflow/core/kernels:identity_op_test PASSED in 0.4s //tensorflow/core/kernels:immutable_constant_op_test PASSED in 0.8s //tensorflow/core/kernels:in_topk_op_test PASSED in 0.4s //tensorflow/core/kernels:isotonic_regression_op_test PASSED in 0.4s //tensorflow/core/kernels:logging_ops_test PASSED in 1.4s //tensorflow/core/kernels:lookup_ops_test PASSED in 0.4s //tensorflow/core/kernels:loss_test PASSED in 0.5s //tensorflow/core/kernels:lrn_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:merge_v2_checkpoints_op_test PASSED in 0.5s //tensorflow/core/kernels:mfcc_dct_test PASSED in 0.3s //tensorflow/core/kernels:mfcc_mel_filterbank_test PASSED in 0.1s //tensorflow/core/kernels:mfcc_op_test_cpu PASSED in 2.0s //tensorflow/core/kernels:mfcc_test PASSED in 0.5s //tensorflow/core/kernels:multinomial_op_test_cpu PASSED in 0.5s //tensorflow/core/kernels:nn_ops_test_cpu PASSED in 0.5s 
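The //tensorflow/core/kernels block above covers individual op kernels (cast, concat, constant, control flow, conv, and so on). A minimal sketch of the Python-level ops those kernels back, assuming TF 2.x eager mode:

    import tensorflow as tf

    x = tf.constant([[1, 2], [3, 4]], dtype=tf.int32)
    y = tf.cast(x, tf.float32)        # kernel exercised by cast_op_test
    z = tf.concat([y, y], axis=0)     # kernel exercised by concat_op_test
    print(z.shape)                    # (4, 2)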
//tensorflow/core/kernels:one_hot_op_test PASSED in 0.4s //tensorflow/core/kernels:ops_testutil_test PASSED in 0.4s //tensorflow/core/kernels:ops_util_test PASSED in 0.2s //tensorflow/core/kernels:parameterized_truncated_normal_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:parse_tensor_test PASSED in 0.4s //tensorflow/core/kernels:quantization_utils_test PASSED in 0.5s //tensorflow/core/kernels:quantize_and_dequantize_op_test_cpu PASSED in 0.5s //tensorflow/core/kernels:quantize_down_and_shrink_range_op_test PASSED in 0.5s //tensorflow/core/kernels:quantize_op_test PASSED in 0.6s //tensorflow/core/kernels:quantized_activation_ops_test PASSED in 0.4s //tensorflow/core/kernels:quantized_add_op_test PASSED in 0.9s //tensorflow/core/kernels:quantized_batch_norm_op_test PASSED in 0.4s //tensorflow/core/kernels:quantized_bias_add_op_test PASSED in 0.4s //tensorflow/core/kernels:quantized_concat_op_test PASSED in 0.4s //tensorflow/core/kernels:quantized_conv_ops_test PASSED in 0.4s //tensorflow/core/kernels:quantized_instance_norm_test PASSED in 0.7s //tensorflow/core/kernels:quantized_matmul_op_test PASSED in 0.4s //tensorflow/core/kernels:quantized_mul_op_test PASSED in 0.9s //tensorflow/core/kernels:quantized_pooling_ops_test PASSED in 0.4s //tensorflow/core/kernels:quantized_reshape_op_test PASSED in 0.4s //tensorflow/core/kernels:quantized_resize_bilinear_op_test PASSED in 1.5s //tensorflow/core/kernels:ragged_fill_empty_rows_op_test PASSED in 0.4s //tensorflow/core/kernels:ragged_gather_op_test PASSED in 0.4s //tensorflow/core/kernels:ragged_range_op_test PASSED in 0.4s //tensorflow/core/kernels:ragged_tensor_from_variant_op_test PASSED in 0.4s //tensorflow/core/kernels:ragged_tensor_to_sparse_kernel_test PASSED in 0.4s //tensorflow/core/kernels:ragged_tensor_to_tensor_op_test PASSED in 0.4s //tensorflow/core/kernels:ragged_tensor_to_variant_op_test PASSED in 0.4s //tensorflow/core/kernels:random_binomial_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:random_index_shuffle_test PASSED in 0.6s //tensorflow/core/kernels:random_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:random_poisson_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:range_sampler_test PASSED in 0.8s //tensorflow/core/kernels:reduction_ops_test_cpu PASSED in 0.3s //tensorflow/core/kernels:regex_replace_op_test PASSED in 0.4s //tensorflow/core/kernels:requantization_range_op_test PASSED in 0.4s //tensorflow/core/kernels:requantize_op_test PASSED in 0.4s //tensorflow/core/kernels:resource_ops_test PASSED in 0.4s //tensorflow/core/kernels:restore_op_test PASSED in 0.4s //tensorflow/core/kernels:restore_v2_op_test PASSED in 0.4s //tensorflow/core/kernels:reverse_op_test PASSED in 0.4s //tensorflow/core/kernels:roll_op_test PASSED in 0.4s //tensorflow/core/kernels:save_op_test PASSED in 0.4s //tensorflow/core/kernels:save_v2_op_test PASSED in 0.4s //tensorflow/core/kernels:scan_ops_test_cpu PASSED in 0.3s //tensorflow/core/kernels:scatter_nd_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:scatter_op_test PASSED in 0.4s //tensorflow/core/kernels:scoped_allocator_ops_test_cpu PASSED in 6.0s //tensorflow/core/kernels:sdca_ops_test PASSED in 1.2s //tensorflow/core/kernels:segment_reduction_ops_test PASSED in 0.3s //tensorflow/core/kernels:sendrecv_ops_test PASSED in 0.3s //tensorflow/core/kernels:sequence_ops_test PASSED in 0.4s //tensorflow/core/kernels:shape_ops_test PASSED in 0.3s //tensorflow/core/kernels:slice_op_test PASSED in 0.4s //tensorflow/core/kernels:spacetobatch_benchmark_test_cpu PASSED in 
0.3s //tensorflow/core/kernels:sparse_add_op_test PASSED in 0.4s //tensorflow/core/kernels:sparse_dense_binary_op_shared_test PASSED in 0.4s //tensorflow/core/kernels:sparse_fill_empty_rows_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:sparse_matmul_op_test_cpu PASSED in 0.3s //tensorflow/core/kernels:sparse_reduce_sum_op_test PASSED in 0.4s //tensorflow/core/kernels:sparse_tensor_dense_matmul_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:sparse_to_dense_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:sparse_utils_test PASSED in 0.7s //tensorflow/core/kernels:sparse_xent_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:spectrogram_op_test_cpu PASSED in 1.8s //tensorflow/core/kernels:spectrogram_test PASSED in 0.7s //tensorflow/core/kernels:split_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:split_v_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels:strided_slice_op_test PASSED in 0.4s //tensorflow/core/kernels:string_format_op_test PASSED in 0.4s //tensorflow/core/kernels:string_ngrams_op_test PASSED in 0.4s //tensorflow/core/kernels:string_split_op_test PASSED in 0.4s //tensorflow/core/kernels:substr_op_test PASSED in 0.4s //tensorflow/core/kernels:summary_audio_op_test PASSED in 0.4s //tensorflow/core/kernels:summary_image_op_test PASSED in 0.4s //tensorflow/core/kernels:summary_op_test PASSED in 0.4s //tensorflow/core/kernels:summary_tensor_op_test PASSED in 0.4s //tensorflow/core/kernels:tensor_cord_test PASSED in 0.2s //tensorflow/core/kernels:tensor_flag_utils_test PASSED in 0.1s //tensorflow/core/kernels:tensor_map_test PASSED in 0.1s //tensorflow/core/kernels:training_ops_test PASSED in 0.4s //tensorflow/core/kernels:transpose_util_test PASSED in 0.4s //tensorflow/core/kernels:unary_ops_composition_test_cpu PASSED in 1.8s //tensorflow/core/kernels:unique_op_test PASSED in 0.4s //tensorflow/core/kernels:variable_ops_test PASSED in 1.5s //tensorflow/core/kernels:while_op_test PASSED in 0.7s //tensorflow/core/kernels:xent_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels/batching_util:basic_batch_scheduler_test PASSED in 0.7s //tensorflow/core/kernels/batching_util:batch_input_task_test PASSED in 0.9s //tensorflow/core/kernels/batching_util:batch_resource_base_test PASSED in 0.1s //tensorflow/core/kernels/batching_util:batch_scheduler_test PASSED in 0.8s //tensorflow/core/kernels/batching_util:bounded_executor_test PASSED in 27.0s //tensorflow/core/kernels/batching_util:input_split_metadata_test PASSED in 0.1s //tensorflow/core/kernels/batching_util:periodic_function_test PASSED in 1.7s //tensorflow/core/kernels/batching_util:serial_device_batch_scheduler_test PASSED in 1.7s //tensorflow/core/kernels/batching_util:shared_batch_scheduler_test PASSED in 6.7s //tensorflow/core/kernels/batching_util:threadsafe_status_test PASSED in 0.1s //tensorflow/core/kernels/data:batch_dataset_op_test PASSED in 0.6s //tensorflow/core/kernels/data:cache_dataset_ops_test PASSED in 0.6s //tensorflow/core/kernels/data:concatenate_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:filter_dataset_op_test PASSED in 0.6s //tensorflow/core/kernels/data:finalize_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:fixed_length_record_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:flat_map_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:get_options_op_test PASSED in 0.4s //tensorflow/core/kernels/data:interleave_dataset_op_test PASSED in 0.6s //tensorflow/core/kernels/data:iterator_ops_test PASSED in 0.5s 
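The batching_util and //tensorflow/core/kernels/data results above correspond to the C++ implementations behind the tf.data transformations (batch, cache, filter, interleave, map, prefetch, and so on). A minimal tf.data pipeline touching several of them, illustrative only:

    import tensorflow as tf

    ds = (tf.data.Dataset.range(10)
          .map(lambda x: x * 2)          # map_dataset_op
          .batch(3)                      # batch_dataset_op
          .prefetch(tf.data.AUTOTUNE))   # prefetch_dataset_op
    for batch in ds:
        print(batch.numpy())             # [0 2 4], [6 8 10], [12 14 16], [18]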
//tensorflow/core/kernels/data:map_dataset_op_test PASSED in 0.7s //tensorflow/core/kernels/data:map_defun_op_test PASSED in 0.4s //tensorflow/core/kernels/data:optimize_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:options_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/data:padded_batch_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:parallel_batch_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:parallel_filter_dataset_op_test PASSED in 0.7s //tensorflow/core/kernels/data:parallel_interleave_dataset_op_test PASSED in 0.7s //tensorflow/core/kernels/data:parallel_map_dataset_op_test PASSED in 0.7s //tensorflow/core/kernels/data:prefetch_autotuner_test PASSED in 0.6s //tensorflow/core/kernels/data:prefetch_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:range_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:reduce_dataset_op_test PASSED in 0.6s //tensorflow/core/kernels/data:repeat_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:rewrite_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:shard_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:shuffle_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:skip_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:sparse_tensor_slice_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:take_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:tensor_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/data:tensor_slice_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:text_line_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:tf_record_dataset_op_test PASSED in 1.1s //tensorflow/core/kernels/data:window_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data:zip_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data/experimental:assert_next_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data/experimental:assert_prev_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data/experimental:auto_shard_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data/experimental:directed_interleave_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/data/experimental:list_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/data/experimental:map_and_batch_dataset_op_test PASSED in 0.7s //tensorflow/core/kernels/data/experimental:parallel_interleave_dataset_op_test PASSED in 0.5s //tensorflow/core/kernels/data/experimental:random_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/data/experimental:sampling_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/data/experimental:save_dataset_op_test PASSED in 0.7s //tensorflow/core/kernels/data/experimental:unique_dataset_op_test PASSED in 0.4s //tensorflow/core/kernels/image:adjust_contrast_op_benchmark_test_cpu PASSED in 0.4s //tensorflow/core/kernels/image:adjust_contrast_op_test PASSED in 0.4s //tensorflow/core/kernels/image:colorspace_op_test PASSED in 0.4s //tensorflow/core/kernels/image:crop_and_resize_op_benchmark_test_cpu PASSED in 0.4s //tensorflow/core/kernels/image:crop_and_resize_op_test PASSED in 0.4s //tensorflow/core/kernels/image:encode_jpeg_op_test PASSED in 0.4s //tensorflow/core/kernels/image:mirror_pad_op_benchmark_test_cpu PASSED in 0.4s //tensorflow/core/kernels/image:mirror_pad_op_test PASSED in 0.5s //tensorflow/core/kernels/image:non_max_suppression_op_benchmark_test PASSED in 0.4s //tensorflow/core/kernels/image:non_max_suppression_op_test PASSED in 0.4s 
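The //tensorflow/core/kernels/image block above tests resize, crop-and-resize, mirror-pad, and non-max-suppression kernels. A couple of the corresponding tf.image calls, as an illustrative sketch:

    import tensorflow as tf

    img = tf.zeros([1, 8, 8, 3])                 # NHWC batch of one
    small = tf.image.resize(img, size=[4, 4])    # resize kernels
    flipped = tf.image.flip_left_right(small)    # mirror-style op
    print(flipped.shape)                         # (1, 4, 4, 3)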
//tensorflow/core/kernels/image:resize_area_op_test PASSED in 0.8s //tensorflow/core/kernels/image:resize_benchmark_test_cpu PASSED in 0.4s //tensorflow/core/kernels/image:resize_ops_test_cpu PASSED in 1.9s //tensorflow/core/kernels/image:sampling_kernels_test PASSED in 0.4s //tensorflow/core/kernels/image:scale_and_translate_op_test PASSED in 1.5s //tensorflow/core/kernels/linalg:banded_triangular_solve_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels/linalg:matrix_triangular_solve_op_test_cpu PASSED in 0.4s //tensorflow/core/kernels/mkl:mkl_conv_ops_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_dequantize_op_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_fused_batch_norm_op_test PASSED in 0.2s //tensorflow/core/kernels/mkl:mkl_fused_ops_test PASSED in 0.6s //tensorflow/core/kernels/mkl:mkl_matmul_op_benchmark PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_qmatmul_op_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_quantize_op_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_quantized_concat_op_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_quantized_conv_ops_perchannel_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_quantized_conv_ops_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_quantized_pooling_ops_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_relu_op_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_requantize_ops_test PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_sparse_matrix_matmul_op_benchmark PASSED in 0.1s //tensorflow/core/kernels/mkl:mkl_swish_op_test PASSED in 0.1s //tensorflow/core/kernels/mkl:onednn_nn_ops_benchmark PASSED in 0.1s //tensorflow/core/kernels/sparse:kernels_test PASSED in 0.4s //tensorflow/core/kernels/uniform_quant_ops:math_utils_test PASSED in 0.1s //tensorflow/core/kernels/uniform_quant_ops:tensor_utils_test PASSED in 0.1s //tensorflow/core/kernels/uniform_quant_ops:uniform_dequantize_op_test PASSED in 0.4s //tensorflow/core/kernels/uniform_quant_ops:uniform_quantize_op_test PASSED in 0.4s //tensorflow/core/kernels/uniform_quant_ops:uniform_quantized_add_op_test PASSED in 0.4s //tensorflow/core/kernels/uniform_quant_ops:uniform_quantized_clip_by_value_op_test PASSED in 0.5s //tensorflow/core/kernels/uniform_quant_ops:uniform_quantized_convolution_ops_test PASSED in 0.4s //tensorflow/core/kernels/uniform_quant_ops:uniform_quantized_dot_ops_test PASSED in 0.4s //tensorflow/core/kernels/uniform_quant_ops:uniform_requantize_op_test PASSED in 0.5s //tensorflow/core/lib/db:sqlite_test PASSED in 0.8s //tensorflow/core/lib/gif:lib_gif_io_test PASSED in 4.6s //tensorflow/core/lib/jpeg:lib_jpeg_jpeg_mem_unittest PASSED in 1.2s //tensorflow/core/ops:cudnn_rnn_ops_test_cc PASSED in 0.4s //tensorflow/core/ops:ops_array_grad_test PASSED in 1.1s //tensorflow/core/ops:ops_math_grad_test PASSED in 3.9s //tensorflow/core/ops:ops_tests PASSED in 0.5s //tensorflow/core/ops/compat:backwards_compatibility_test PASSED in 1.1s //tensorflow/core/platform:enable_tf2_utils_test PASSED in 0.7s //tensorflow/core/platform:env_test PASSED in 3.7s //tensorflow/core/platform:fake_python_env_test PASSED in 0.1s //tensorflow/core/platform:file_system_test PASSED in 1.0s //tensorflow/core/platform:platform_strings_test PASSED in 0.1s //tensorflow/core/platform:ram_file_system_test PASSED in 10.5s //tensorflow/core/platform:resource_loader_test PASSED in 0.3s //tensorflow/core/platform:vmodule_benchmark_test PASSED in 0.1s //tensorflow/core/platform:vmodule_test PASSED in 0.8s 
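The uniform_quant_ops and quantized-kernel results above cover the quantize/dequantize/requantize path. A minimal sketch of quantizing a float tensor to qint8 through the public API, assuming TF 2.x:

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 1.0])
    q = tf.quantization.quantize(x, min_range=-1.0, max_range=1.0, T=tf.qint8)
    back = tf.quantization.dequantize(q.output, q.output_min, q.output_max)
    print(back.numpy())   # approximately [-1., 0., 1.]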
//tensorflow/core/profiler/convert:dcn_analysis_test PASSED in 0.1s //tensorflow/core/profiler/convert:dcn_utils_test PASSED in 0.5s //tensorflow/core/profiler/convert:hlo_proto_to_graph_view_test PASSED in 0.9s //tensorflow/core/profiler/convert:hlo_proto_to_memory_visualization_utils_test PASSED in 0.9s //tensorflow/core/profiler/convert:op_stats_combiner_test PASSED in 0.1s //tensorflow/core/profiler/convert:op_stats_to_pod_stats_test PASSED in 0.1s //tensorflow/core/profiler/convert:op_stats_to_pod_viewer_test PASSED in 0.7s //tensorflow/core/profiler/convert:op_stats_to_tf_stats_test PASSED in 0.4s //tensorflow/core/profiler/convert:repository_test PASSED in 0.2s //tensorflow/core/profiler/convert:xplane_to_dcn_collective_stats_test PASSED in 0.1s //tensorflow/core/profiler/convert:xplane_to_kernel_stats_db_test PASSED in 0.1s //tensorflow/core/profiler/convert:xplane_to_memory_profile_test PASSED in 0.2s //tensorflow/core/profiler/convert:xplane_to_op_metrics_db_test PASSED in 0.2s //tensorflow/core/profiler/convert:xplane_to_op_stats_test PASSED in 1.0s //tensorflow/core/profiler/convert:xplane_to_step_events_test PASSED in 0.2s //tensorflow/core/profiler/convert:xplane_to_tf_functions_test PASSED in 0.9s //tensorflow/core/profiler/convert:xplane_to_tool_names_test PASSED in 0.2s //tensorflow/core/profiler/convert/trace_viewer:trace_viewer_visibility_test PASSED in 0.1s //tensorflow/core/profiler/internal:tfprof_show_test PASSED in 0.7s //tensorflow/core/profiler/internal:tfprof_stats_test PASSED in 0.5s //tensorflow/core/profiler/internal:tfprof_tensor_test PASSED in 0.4s //tensorflow/core/profiler/internal:tfprof_timeline_test PASSED in 0.5s //tensorflow/core/profiler/internal/advisor:tfprof_advisor_test PASSED in 0.4s //tensorflow/core/profiler/lib:profiler_disabled_test PASSED in 0.7s //tensorflow/core/profiler/utils:derived_timeline_test PASSED in 0.8s //tensorflow/core/profiler/utils:kernel_stats_utils_test PASSED in 0.1s //tensorflow/core/profiler/utils:op_metrics_db_utils_test PASSED in 0.9s //tensorflow/core/profiler/utils:step_intersection_test PASSED in 0.8s //tensorflow/core/runtime_fallback/util:type_util_test PASSED in 0.8s //tensorflow/core/summary:schema_test PASSED in 0.1s //tensorflow/core/summary:summary_db_writer_test PASSED in 0.2s //tensorflow/core/summary:summary_file_writer_test PASSED in 0.9s //tensorflow/core/tfrt/common:pjrt_cpu_client_registration_test PASSED in 6.0s //tensorflow/core/tfrt/common:pjrt_state_test PASSED in 6.1s //tensorflow/core/tfrt/common:pjrt_util_test PASSED in 5.7s //tensorflow/core/tfrt/fallback:cost_recorder_test PASSED in 0.2s //tensorflow/core/tfrt/fallback:fallback_state_test PASSED in 0.9s //tensorflow/core/tfrt/graph_executor:config_test PASSED in 0.1s //tensorflow/core/tfrt/mlrt/attribute:attribute_test PASSED in 0.7s //tensorflow/core/tfrt/mlrt/bytecode:bytecode_test PASSED in 0.2s //tensorflow/core/tfrt/mlrt/bytecode:executable_test PASSED in 0.1s //tensorflow/core/tfrt/mlrt/bytecode:function_test PASSED in 0.7s //tensorflow/core/tfrt/mlrt/bytecode:kernel_test PASSED in 0.1s //tensorflow/core/tfrt/mlrt/bytecode:span_test PASSED in 0.5s //tensorflow/core/tfrt/mlrt/interpreter:context_test PASSED in 0.6s //tensorflow/core/tfrt/mlrt/interpreter:future_test PASSED in 0.1s //tensorflow/core/tfrt/mlrt/interpreter:interpreter_test PASSED in 0.1s //tensorflow/core/tfrt/mlrt/interpreter:register_span_test PASSED in 0.6s //tensorflow/core/tfrt/mlrt/interpreter:value_test PASSED in 0.1s 
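The //tensorflow/core/profiler results above test the trace-collection and conversion pipeline (XPlane to op stats, memory profiles, kernel stats). An illustrative sketch of the Python entry point those conversions ultimately serve:

    import tensorflow as tf

    tf.profiler.experimental.start("/tmp/profile_demo")
    x = tf.random.normal([256, 256])
    y = tf.matmul(x, x)                 # work captured in the trace
    tf.profiler.experimental.stop()     # writes profile data to the logdir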
//tensorflow/core/tfrt/run_handler_thread_pool:run_handler_concurrent_work_queue_test PASSED in 0.7s //tensorflow/core/tfrt/run_handler_thread_pool:run_handler_test PASSED in 1.6s //tensorflow/core/tfrt/run_handler_thread_pool:run_handler_util_test PASSED in 0.1s //tensorflow/core/tfrt/runtime:tf_threadpool_concurrent_work_queue_test PASSED in 0.1s //tensorflow/core/tfrt/runtime:work_queue_interface_test PASSED in 0.7s //tensorflow/core/tfrt/utils:graph_partition_test PASSED in 2.1s //tensorflow/core/transforms:eval_utils_test PASSED in 1.3s //tensorflow/core/transforms:graph_transform_wrapper_test PASSED in 0.7s //tensorflow/core/util:bcast_test PASSED in 0.7s //tensorflow/core/util:command_line_flags_test PASSED in 0.6s //tensorflow/core/util:debug_data_dumper_test PASSED in 0.7s //tensorflow/core/util:debug_events_writer_test PASSED in 0.8s //tensorflow/core/util:dump_graph_test PASSED in 0.7s //tensorflow/core/util:equal_graph_def_test PASSED in 0.7s //tensorflow/core/util:events_writer_test PASSED in 2.7s //tensorflow/core/util:example_proto_fast_parsing_test PASSED in 1.0s //tensorflow/core/util:example_proto_helper_test PASSED in 0.8s //tensorflow/core/util:exec_on_stall_test PASSED in 2.4s //tensorflow/core/util:fake_clock_env_test PASSED in 1.9s //tensorflow/core/util:incremental_barrier_test PASSED in 0.1s //tensorflow/core/util:matmul_bcast_test PASSED in 0.7s //tensorflow/core/util:memmapped_file_system_test PASSED in 0.6s //tensorflow/core/util:mkl_heuristics_test PASSED in 0.1s //tensorflow/core/util:overflow_test PASSED in 0.2s //tensorflow/core/util:presized_cuckoo_map_test PASSED in 2.3s //tensorflow/core/util:ragged_to_dense_util_test PASSED in 0.3s //tensorflow/core/util:reffed_status_callback_test PASSED in 0.7s //tensorflow/core/util:reporter_test PASSED in 0.7s //tensorflow/core/util:saved_tensor_slice_util_test PASSED in 0.6s //tensorflow/core/util:semver_test PASSED in 0.7s //tensorflow/core/util:stat_summarizer_test PASSED in 0.7s //tensorflow/core/util:strided_slice_op_test PASSED in 0.6s //tensorflow/core/util:tensor_format_test PASSED in 0.6s //tensorflow/core/util:tensor_slice_reader_test PASSED in 0.7s //tensorflow/core/util:tensor_slice_set_test PASSED in 0.7s //tensorflow/core/util:tensor_slice_util_test PASSED in 0.6s //tensorflow/core/util:tensor_slice_writer_test PASSED in 1.2s //tensorflow/core/util:work_sharder_test PASSED in 0.8s //tensorflow/core/util/ctc:ctc_beam_search_test PASSED in 0.1s //tensorflow/core/util/proto:descriptor_pool_registry_test PASSED in 0.3s //tensorflow/core/util/proto:proto_utils_test PASSED in 0.3s //tensorflow/core/util/quantization:uniform_quant_ops_params_test PASSED in 0.5s //tensorflow/core/util/sparse:sparse_tensor_test PASSED in 0.7s //tensorflow/core/util/tensor_bundle:tensor_bundle_test PASSED in 37.2s //tensorflow/dtensor/mlir:dtensor_location_test PASSED in 0.1s //tensorflow/dtensor/mlir/tests:annotate_global_shape.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:cluster_function_conversion.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:constant_folding.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:decompose_controlflow.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:designate_resource_handle_mesh.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:device_mesh_cluster_coarsening.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_all_gather.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_all_scatter.mlir.test PASSED in 0.5s 
//tensorflow/dtensor/mlir/tests:dtensor_allreduce_combine_optimization.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_allreduce_lowering.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_allreduce_scatter_optimization.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_allreduce_sum_optimization.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_alltoall_lowering.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_collective_type_lowering.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_layout_must_execute.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_layout_to_xla_sharding_op.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:dtensor_mixed_precision_reduce.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_reduce_scatter_lowering.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_remove_dtensorlayout.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_replace_auxiliary_layout_op.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_replace_relayout_with_identity.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_set_hlo_sharding.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_set_hlo_sharding_default.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:dtensor_xla_spmd_integration.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:elide_identity_before_copy_to_mesh.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:function_renaming.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:handle_cross_cluster_dependencies.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:handle_sparsetensors.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:layout_propagation_v2.mlir.test PASSED in 0.7s //tensorflow/dtensor/mlir/tests:lower_send_recv.mlir.test PASSED in 1.5s //tensorflow/dtensor/mlir/tests:merge_clusters.mlir.test PASSED in 2.4s //tensorflow/dtensor/mlir/tests:mesh_propagation.mlir.test PASSED in 1.7s //tensorflow/dtensor/mlir/tests:multi_device_expansion.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:op_to_device_cluster.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:propagate_default_layout.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:propagate_device_id_to_function.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:restore_and_assign.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:restore_shape_inference.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:set_default_sharding.mlir.test PASSED in 0.5s //tensorflow/dtensor/mlir/tests:sparse_expansion.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_batchparallel.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_concat.mlir.test PASSED in 0.8s //tensorflow/dtensor/mlir/tests:spmd_conv.mlir.test PASSED in 0.7s //tensorflow/dtensor/mlir/tests:spmd_einsum.mlir.test PASSED in 0.7s //tensorflow/dtensor/mlir/tests:spmd_expansion.mlir.test PASSED in 0.8s //tensorflow/dtensor/mlir/tests:spmd_fft.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_io_ops.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_iterator.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_matmul.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_random.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_save_restore.mlir.test PASSED in 0.7s //tensorflow/dtensor/mlir/tests:spmd_segment_sum.mlir.test PASSED in 0.6s 
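The //tensorflow/dtensor/mlir/tests block above checks the SPMD-expansion and layout-propagation passes. A hedged sketch of the Python DTensor API those passes serve; it assumes two logical CPU devices are configured, which is not the default:

    import tensorflow as tf
    from tensorflow.experimental import dtensor

    # Assumption: CPU:0 and CPU:1 exist (e.g. via logical-device config).
    mesh = dtensor.create_mesh([("batch", 2)], devices=["CPU:0", "CPU:1"])
    layout = dtensor.Layout(["batch", dtensor.UNSHARDED], mesh)
    t = dtensor.call_with_layout(tf.zeros, layout, shape=[4, 4])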
//tensorflow/dtensor/mlir/tests:spmd_slice.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_softmax_loss.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_squeeze.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:spmd_var_handle.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:tf_dtensor_ops.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:tpu_add_resource_device_attribute.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:tpu_integration.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:undo_merge_const_across_mesh.mlir.test PASSED in 0.6s //tensorflow/dtensor/mlir/tests:update_tpu_metadata.mlir.test PASSED in 0.6s //tensorflow/dtensor/python/tests:api_test PASSED in 31.0s //tensorflow/dtensor/python/tests:array_ops_test_cpu PASSED in 23.1s //tensorflow/dtensor/python/tests:cache_test_cpu PASSED in 19.9s //tensorflow/dtensor/python/tests:collective_combine_all_reduce_test_cpu PASSED in 19.7s //tensorflow/dtensor/python/tests:collective_test_cpu PASSED in 22.2s //tensorflow/dtensor/python/tests:config_test_cpu PASSED in 10.3s //tensorflow/dtensor/python/tests:device_test_cpu PASSED in 48.1s //tensorflow/dtensor/python/tests:layout_test_cpu PASSED in 22.3s //tensorflow/dtensor/python/tests:mesh_util_test_cpu PASSED in 12.9s //tensorflow/dtensor/python/tests:multi_client_test_cpu PASSED in 83.0s //tensorflow/dtensor/python/tests:numpy_util_test_cpu PASSED in 79.2s //tensorflow/dtensor/python/tests:variable_test_cpu PASSED in 12.9s //tensorflow/dtensor/tests:dtensor_operation_test PASSED in 23.3s //tensorflow/dtensor/tests:executable_manager_test PASSED in 23.9s //tensorflow/dtensor/tests:layout_to_xla_sharding_test PASSED in 0.2s //tensorflow/dtensor/tests:slice_util_test PASSED in 0.3s //tensorflow/dtensor/tests:spmd_expander_test PASSED in 6.3s //tensorflow/dtensor/tests:tensor_layout_test PASSED in 0.2s //tensorflow/examples/adding_an_op:fact_test PASSED in 106.4s //tensorflow/examples/adding_an_op:zero_out_1_test PASSED in 52.0s //tensorflow/examples/adding_an_op:zero_out_2_test PASSED in 48.4s //tensorflow/examples/adding_an_op:zero_out_3_test PASSED in 47.2s //tensorflow/examples/custom_ops_doc/multiplex_1:multiplex_1_test PASSED in 45.5s //tensorflow/examples/custom_ops_doc/multiplex_2:multiplex_2_test_cpu PASSED in 24.6s //tensorflow/examples/custom_ops_doc/multiplex_3:multiplex_3_test PASSED in 46.1s //tensorflow/examples/custom_ops_doc/multiplex_4:multiplex_4_test PASSED in 102.1s //tensorflow/examples/custom_ops_doc/simple_hash_table:simple_hash_table_test PASSED in 98.1s //tensorflow/examples/custom_ops_doc/sleep:sleep_test PASSED in 127.8s //tensorflow/examples/speech_commands:accuracy_utils_test PASSED in 2.0s //tensorflow/examples/speech_commands:models_test PASSED in 125.1s //tensorflow/examples/speech_commands:recognize_commands_test PASSED in 1.8s //tensorflow/examples/wav_to_spectrogram:wav_to_spectrogram_test PASSED in 1.7s //tensorflow/js:ts_op_gen_test PASSED in 1.6s //tensorflow/python/autograph/converters:asserts_test PASSED in 8.8s //tensorflow/python/autograph/converters:break_statements_test PASSED in 9.2s //tensorflow/python/autograph/converters:call_trees_test PASSED in 9.4s //tensorflow/python/autograph/converters:conditional_expressions_test PASSED in 8.8s //tensorflow/python/autograph/converters:continue_statements_test PASSED in 9.8s //tensorflow/python/autograph/converters:control_flow_test PASSED in 14.1s //tensorflow/python/autograph/converters:directives_test PASSED in 9.1s 
//tensorflow/python/autograph/converters:functions_test PASSED in 8.9s //tensorflow/python/autograph/converters:lists_test PASSED in 9.4s //tensorflow/python/autograph/converters:logical_expressions_test PASSED in 9.8s //tensorflow/python/autograph/converters:return_statements_test PASSED in 10.7s //tensorflow/python/autograph/converters:slices_test PASSED in 8.9s //tensorflow/python/autograph/converters:variables_test PASSED in 8.7s //tensorflow/python/autograph/core:converter_test PASSED in 9.0s //tensorflow/python/autograph/core:function_wrappers_test PASSED in 8.2s //tensorflow/python/autograph/impl:api_test PASSED in 19.2s //tensorflow/python/autograph/impl:conversion_test PASSED in 11.7s //tensorflow/python/autograph/lang:special_functions_test PASSED in 8.2s //tensorflow/python/autograph/operators:conditional_expressions_test PASSED in 8.7s //tensorflow/python/autograph/operators:control_flow_test PASSED in 16.3s //tensorflow/python/autograph/operators:data_structures_test PASSED in 8.6s //tensorflow/python/autograph/operators:exceptions_test PASSED in 8.5s //tensorflow/python/autograph/operators:logical_test PASSED in 8.6s //tensorflow/python/autograph/operators:py_builtins_test PASSED in 15.3s //tensorflow/python/autograph/operators:slices_test PASSED in 8.8s //tensorflow/python/autograph/operators:variables_test PASSED in 8.7s //tensorflow/python/autograph/pyct:anno_test PASSED in 8.4s //tensorflow/python/autograph/pyct:ast_util_test PASSED in 8.7s //tensorflow/python/autograph/pyct:cache_test PASSED in 8.3s //tensorflow/python/autograph/pyct:cfg_test PASSED in 9.0s //tensorflow/python/autograph/pyct:error_utils_test PASSED in 8.2s //tensorflow/python/autograph/pyct:inspect_utils_test PASSED in 9.3s //tensorflow/python/autograph/pyct:loader_test PASSED in 8.4s //tensorflow/python/autograph/pyct:naming_test PASSED in 8.5s //tensorflow/python/autograph/pyct:origin_info_test PASSED in 8.5s //tensorflow/python/autograph/pyct:parser_test PASSED in 9.0s //tensorflow/python/autograph/pyct:pretty_printer_test PASSED in 8.4s //tensorflow/python/autograph/pyct:qual_names_test PASSED in 8.6s //tensorflow/python/autograph/pyct:templates_test PASSED in 8.2s //tensorflow/python/autograph/pyct:transformer_test PASSED in 8.6s //tensorflow/python/autograph/pyct:transpiler_test PASSED in 8.5s //tensorflow/python/autograph/pyct/static_analysis:activity_test PASSED in 8.5s //tensorflow/python/autograph/pyct/static_analysis:liveness_test PASSED in 8.8s //tensorflow/python/autograph/pyct/static_analysis:reaching_definitions_test PASSED in 8.8s //tensorflow/python/autograph/pyct/static_analysis:reaching_fndefs_test PASSED in 8.6s //tensorflow/python/autograph/pyct/static_analysis:type_inference_test PASSED in 9.0s //tensorflow/python/autograph/tests:assertion_test PASSED in 67.4s //tensorflow/python/autograph/tests:basic_ifexp_test PASSED in 26.0s //tensorflow/python/autograph/tests:call_to_builtin_function_test PASSED in 24.8s //tensorflow/python/autograph/tests:call_to_lambda_function_test PASSED in 24.9s //tensorflow/python/autograph/tests:call_to_named_tuple_test PASSED in 27.0s //tensorflow/python/autograph/tests:call_to_numpy_function_test PASSED in 26.8s //tensorflow/python/autograph/tests:call_to_print_function_test PASSED in 27.7s //tensorflow/python/autograph/tests:call_to_tf_api_test PASSED in 25.6s //tensorflow/python/autograph/tests:call_to_user_function_test PASSED in 26.3s //tensorflow/python/autograph/tests:composite_names_in_control_flow_test PASSED in 35.4s 
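The autograph converter and pyct results above test the Python-to-graph rewriting that tf.function applies. A minimal example of a Python loop that AutoGraph converts into graph control flow:

    import tensorflow as tf

    @tf.function
    def cumulative(n):
        # AutoGraph rewrites this Python for-loop into a graph while-loop.
        total = tf.constant(0)
        for i in tf.range(n):
            total += i
        return total

    print(cumulative(tf.constant(5)).numpy())   # 0+1+2+3+4 = 10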
//tensorflow/python/autograph/tests:cond_basic_test PASSED in 36.7s //tensorflow/python/autograph/tests:datasets_test PASSED in 31.0s //tensorflow/python/autograph/tests:early_return_test PASSED in 31.5s //tensorflow/python/autograph/tests:ext_slice_test PASSED in 26.3s //tensorflow/python/autograph/tests:generator_test PASSED in 24.3s //tensorflow/python/autograph/tests:logical_expression_test PASSED in 27.7s //tensorflow/python/autograph/tests:loop_basic_test PASSED in 80.5s //tensorflow/python/autograph/tests:loop_control_flow_illegal_cases_test PASSED in 25.7s //tensorflow/python/autograph/tests:loop_created_variables_test PASSED in 28.6s //tensorflow/python/autograph/tests:loop_scoping_test PASSED in 29.4s //tensorflow/python/autograph/tests:loop_with_function_call_test PASSED in 36.5s //tensorflow/python/autograph/tests:loop_with_variable_type_illegal_cases_test PASSED in 28.5s //tensorflow/python/autograph/tests:loop_with_variable_type_test PASSED in 42.3s //tensorflow/python/autograph/tests:nested_control_flow_test PASSED in 149.6s //tensorflow/python/autograph/tests:type_annotations_test PASSED in 24.9s //tensorflow/python/autograph/utils:context_managers_test PASSED in 9.2s //tensorflow/python/autograph/utils:misc_test PASSED in 9.5s //tensorflow/python/autograph/utils:tensor_list_test PASSED in 9.7s //tensorflow/python/autograph/utils:tensors_test PASSED in 9.7s //tensorflow/python/checkpoint:checkpoint_management_test_cpu PASSED in 22.1s //tensorflow/python/checkpoint:checkpoint_metrics_test PASSED in 19.3s //tensorflow/python/checkpoint:checkpoint_test PASSED in 40.3s //tensorflow/python/checkpoint:checkpoint_view_test PASSED in 12.9s //tensorflow/python/checkpoint:checkpoint_with_v1_optimizers_test PASSED in 15.1s //tensorflow/python/checkpoint:functional_saver_test_cpu PASSED in 14.4s //tensorflow/python/checkpoint:restore_test PASSED in 12.2s //tensorflow/python/checkpoint:save_util_v1_test PASSED in 11.7s //tensorflow/python/checkpoint:saveable_compat_test PASSED in 15.3s //tensorflow/python/checkpoint:tensor_callable_test PASSED in 14.8s //tensorflow/python/checkpoint:trackable_view_test PASSED in 8.9s //tensorflow/python/checkpoint/sharding:sharding_policies_test PASSED in 18.1s //tensorflow/python/checkpoint/sharding:sharding_util_test PASSED in 14.5s //tensorflow/python/client:device_lib_test_cpu PASSED in 13.9s //tensorflow/python/client:events_writer_test PASSED in 12.0s //tensorflow/python/client:session_list_devices_test PASSED in 12.8s //tensorflow/python/client:session_partial_run_test PASSED in 19.7s //tensorflow/python/client:timeline_test_cpu PASSED in 10.6s //tensorflow/python/client:virtual_gpu_test_cpu PASSED in 10.6s //tensorflow/python/compat:compat_test PASSED in 10.0s //tensorflow/python/compat:disable_v2_behavior_test PASSED in 10.1s //tensorflow/python/compiler/mlir:mlir_test PASSED in 9.3s //tensorflow/python/compiler/tensorrt/test:batch_matmul_test_cpu PASSED in 13.0s //tensorflow/python/compiler/tensorrt/test:biasadd_matmul_test_cpu PASSED in 15.6s //tensorflow/python/compiler/tensorrt/test:bool_test_cpu PASSED in 13.1s //tensorflow/python/compiler/tensorrt/test:cast_test_cpu PASSED in 11.7s //tensorflow/python/compiler/tensorrt/test:concatenation_test_cpu PASSED in 11.8s //tensorflow/python/compiler/tensorrt/test:const_broadcast_test_cpu PASSED in 14.0s //tensorflow/python/compiler/tensorrt/test:data_dependent_shape_test_cpu PASSED in 12.7s //tensorflow/python/compiler/tensorrt/test:dynamic_input_shapes_test_cpu PASSED in 14.1s 
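The //tensorflow/python/checkpoint results above cover save/restore, sharding policies, and checkpoint management. A minimal save-then-restore roundtrip with the object-based API, illustrative only:

    import tensorflow as tf

    v = tf.Variable(1.0)
    ckpt = tf.train.Checkpoint(v=v)
    path = ckpt.save("/tmp/ckpt_demo/ckpt")   # checkpoint management
    v.assign(42.0)
    ckpt.restore(path)                        # object-based restore
    print(v.numpy())                          # 1.0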
//tensorflow/python/compiler/tensorrt/test:identity_output_test_cpu PASSED in 14.0s //tensorflow/python/compiler/tensorrt/test:int32_test_cpu PASSED in 14.4s //tensorflow/python/compiler/tensorrt/test:lru_cache_test_cpu PASSED in 12.9s //tensorflow/python/compiler/tensorrt/test:multi_connection_neighbor_engine_test_cpu PASSED in 12.3s //tensorflow/python/compiler/tensorrt/test:neighboring_engine_test_cpu PASSED in 14.9s //tensorflow/python/compiler/tensorrt/test:quantization_test_cpu PASSED in 12.8s //tensorflow/python/compiler/tensorrt/test:rank_two_test_cpu PASSED in 82.8s //tensorflow/python/compiler/tensorrt/test:reshape_transpose_test_cpu PASSED in 30.8s //tensorflow/python/compiler/tensorrt/test:topk_test_cpu PASSED in 12.2s //tensorflow/python/compiler/tensorrt/test:trt_engine_op_shape_test_cpu PASSED in 13.4s //tensorflow/python/compiler/tensorrt/test:trt_mode_test_cpu PASSED in 82.3s //tensorflow/python/compiler/tensorrt/test:unary_test_cpu PASSED in 14.3s //tensorflow/python/compiler/tensorrt/test:vgg_block_nchw_test_cpu PASSED in 15.9s //tensorflow/python/compiler/tensorrt/test:vgg_block_test_cpu PASSED in 17.5s //tensorflow/python/compiler/xla:jit_compile_test_cpu PASSED in 15.8s //tensorflow/python/compiler/xla:jit_test_cpu PASSED in 19.8s //tensorflow/python/compiler/xla:xla_test_cpu PASSED in 16.8s //tensorflow/python/compiler/xla/experimental:xla_sharding_test PASSED in 9.4s //tensorflow/python/data/experimental/kernel_tests:assert_cardinality_test PASSED in 32.8s //tensorflow/python/data/experimental/kernel_tests:assert_next_test PASSED in 13.0s //tensorflow/python/data/experimental/kernel_tests:assert_prev_test PASSED in 14.1s //tensorflow/python/data/experimental/kernel_tests:compression_ops_test PASSED in 18.1s //tensorflow/python/data/experimental/kernel_tests:copy_to_device_test_cpu PASSED in 23.3s //tensorflow/python/data/experimental/kernel_tests:dense_to_sparse_batch_test PASSED in 35.0s //tensorflow/python/data/experimental/kernel_tests:io_test PASSED in 86.1s //tensorflow/python/data/experimental/kernel_tests:iterator_ops_test PASSED in 14.4s //tensorflow/python/data/experimental/kernel_tests:lookup_ops_test PASSED in 12.6s //tensorflow/python/data/experimental/kernel_tests:make_csv_dataset_test PASSED in 36.1s //tensorflow/python/data/experimental/kernel_tests:make_saveable_from_iterator_test PASSED in 11.7s //tensorflow/python/data/experimental/kernel_tests:make_tf_record_dataset_test PASSED in 86.8s //tensorflow/python/data/experimental/kernel_tests:map_defun_op_test PASSED in 12.4s //tensorflow/python/data/experimental/kernel_tests:matching_files_dataset_test PASSED in 29.0s //tensorflow/python/data/experimental/kernel_tests:model_dataset_test PASSED in 21.2s //tensorflow/python/data/experimental/kernel_tests:non_serializable_test PASSED in 22.9s //tensorflow/python/data/experimental/kernel_tests:pad_to_cardinality_test PASSED in 14.9s //tensorflow/python/data/experimental/kernel_tests:prefetch_to_device_test_cpu PASSED in 15.2s //tensorflow/python/data/experimental/kernel_tests:prefetch_with_slack_test PASSED in 16.7s //tensorflow/python/data/experimental/kernel_tests:shuffle_and_repeat_test PASSED in 34.0s //tensorflow/python/data/experimental/kernel_tests:sleep_test PASSED in 16.6s //tensorflow/python/data/experimental/kernel_tests:tf_record_writer_test PASSED in 15.8s //tensorflow/python/data/experimental/kernel_tests:variant_test PASSED in 13.6s //tensorflow/python/data/experimental/kernel_tests:wrap_unwrap_test_cpu PASSED in 12.9s 
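Several of the experimental kernel tests above (make_tf_record_dataset_test, tf_record_writer_test) revolve around TFRecord I/O. A minimal write-then-read sketch:

    import tensorflow as tf

    path = "/tmp/demo.tfrecord"
    with tf.io.TFRecordWriter(path) as w:     # TFRecord writer
        for i in range(3):
            w.write(str(i).encode())
    ds = tf.data.TFRecordDataset(path)        # TFRecord dataset op
    print([r.numpy() for r in ds])            # [b'0', b'1', b'2']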
//tensorflow/python/data/experimental/kernel_tests/optimization:filter_fusion_test PASSED in 55.1s //tensorflow/python/data/experimental/kernel_tests/optimization:filter_parallelization_test PASSED in 82.1s //tensorflow/python/data/experimental/kernel_tests/optimization:grappler_test_cpu PASSED in 12.4s //tensorflow/python/data/experimental/kernel_tests/optimization:make_deterministic_test PASSED in 42.4s //tensorflow/python/data/experimental/kernel_tests/optimization:map_and_batch_fusion_test PASSED in 14.2s //tensorflow/python/data/experimental/kernel_tests/optimization:map_and_filter_fusion_test PASSED in 31.8s //tensorflow/python/data/experimental/kernel_tests/optimization:map_fusion_test PASSED in 239.7s //tensorflow/python/data/experimental/kernel_tests/optimization:map_parallelization_test PASSED in 18.1s //tensorflow/python/data/experimental/kernel_tests/optimization:noop_elimination_test PASSED in 19.1s //tensorflow/python/data/experimental/kernel_tests/optimization:seq_interleave_prefetch_test PASSED in 21.6s //tensorflow/python/data/experimental/kernel_tests/service:multi_device_test PASSED in 22.4s //tensorflow/python/data/experimental/service:server_lib_test PASSED in 16.0s //tensorflow/python/data/kernel_tests:as_numpy_iterator_test PASSED in 18.6s //tensorflow/python/data/kernel_tests:bucket_by_sequence_length_test PASSED in 26.2s //tensorflow/python/data/kernel_tests:cache_test PASSED in 51.1s //tensorflow/python/data/kernel_tests:cardinality_test PASSED in 21.4s //tensorflow/python/data/kernel_tests:checkpoint_test PASSED in 27.6s //tensorflow/python/data/kernel_tests:concatenate_test PASSED in 30.8s //tensorflow/python/data/kernel_tests:counter_test PASSED in 37.2s //tensorflow/python/data/kernel_tests:dataset_spec_test PASSED in 16.1s //tensorflow/python/data/kernel_tests:dataset_test PASSED in 31.5s //tensorflow/python/data/kernel_tests:enumerate_test PASSED in 27.2s //tensorflow/python/data/kernel_tests:fingerprint_test PASSED in 15.6s //tensorflow/python/data/kernel_tests:from_sparse_tensor_slices_test PASSED in 14.2s //tensorflow/python/data/kernel_tests:get_single_element_test PASSED in 16.9s //tensorflow/python/data/kernel_tests:ignore_errors_test PASSED in 32.9s //tensorflow/python/data/kernel_tests:io_test PASSED in 30.3s //tensorflow/python/data/kernel_tests:iterator_test_cpu PASSED in 43.4s //tensorflow/python/data/kernel_tests:len_test PASSED in 14.3s //tensorflow/python/data/kernel_tests:optional_test_cpu PASSED in 20.2s //tensorflow/python/data/kernel_tests:options_test PASSED in 17.6s //tensorflow/python/data/kernel_tests:placement_test_cpu PASSED in 17.4s //tensorflow/python/data/kernel_tests:prefetch_test PASSED in 70.2s //tensorflow/python/data/kernel_tests:random_test PASSED in 40.3s //tensorflow/python/data/kernel_tests:range_test PASSED in 57.4s //tensorflow/python/data/kernel_tests:rebatch_test PASSED in 22.3s //tensorflow/python/data/kernel_tests:reduce_test_cpu PASSED in 41.5s //tensorflow/python/data/kernel_tests:scan_test_cpu PASSED in 74.9s //tensorflow/python/data/kernel_tests:sparse_batch_test PASSED in 31.7s //tensorflow/python/data/kernel_tests:unbatch_test PASSED in 35.5s //tensorflow/python/data/util:convert_test PASSED in 10.7s //tensorflow/python/data/util:nest_test PASSED in 9.9s //tensorflow/python/data/util:options_test PASSED in 10.0s //tensorflow/python/data/util:random_seed_test PASSED in 10.6s //tensorflow/python/data/util:sparse_test PASSED in 10.8s //tensorflow/python/data/util:structure_test PASSED in 10.6s 
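The optimization tests above (map_and_batch_fusion, map_parallelization, noop_elimination) exercise graph rewrites that tf.data applies through its options machinery. An illustrative sketch of opting into one such rewrite:

    import tensorflow as tf

    options = tf.data.Options()
    options.experimental_optimization.map_and_batch_fusion = True
    ds = (tf.data.Dataset.range(100)
          .map(lambda x: x + 1)
          .batch(10)
          .with_options(options))    # rewrite fuses map+batch at runtime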
//tensorflow/python/data/util:traverse_test PASSED in 9.9s //tensorflow/python/debug/cli:analyzer_cli_test_cpu PASSED in 12.4s //tensorflow/python/debug/cli:cli_config_test PASSED in 8.8s //tensorflow/python/debug/cli:cli_shared_test PASSED in 8.3s //tensorflow/python/debug/cli:command_parser_test PASSED in 8.5s //tensorflow/python/debug/cli:debugger_cli_common_test PASSED in 8.5s //tensorflow/python/debug/cli:evaluator_test PASSED in 9.2s //tensorflow/python/debug/cli:profile_analyzer_cli_test PASSED in 9.7s //tensorflow/python/debug/cli:readline_ui_test PASSED in 9.8s //tensorflow/python/debug/cli:tensor_format_test PASSED in 10.1s //tensorflow/python/debug/lib:check_numerics_callback_test_cpu PASSED in 23.1s //tensorflow/python/debug/lib:common_test PASSED in 9.8s //tensorflow/python/debug/lib:debug_data_test PASSED in 10.0s //tensorflow/python/debug/lib:debug_events_monitors_test PASSED in 10.1s //tensorflow/python/debug/lib:debug_events_writer_test PASSED in 10.1s //tensorflow/python/debug/lib:debug_gradients_test_cpu PASSED in 13.1s //tensorflow/python/debug/lib:debug_graph_reconstruction_test_cpu PASSED in 16.5s //tensorflow/python/debug/lib:debug_graphs_test PASSED in 9.7s //tensorflow/python/debug/lib:debug_grappler_test_cpu PASSED in 19.5s //tensorflow/python/debug/lib:debug_utils_test PASSED in 7.7s //tensorflow/python/debug/lib:debug_v2_ops_test_cpu PASSED in 24.7s //tensorflow/python/debug/lib:profiling_test PASSED in 7.7s //tensorflow/python/debug/lib:session_debug_file_test_cpu PASSED in 17.8s //tensorflow/python/debug/lib:session_debug_multi_gpu_test_cpu PASSED in 10.7s //tensorflow/python/debug/lib:source_utils_test PASSED in 10.4s //tensorflow/python/debug/wrappers:disk_usage_test PASSED in 9.5s //tensorflow/python/debug/wrappers:dumping_wrapper_test PASSED in 10.1s //tensorflow/python/debug/wrappers:framework_test PASSED in 9.6s //tensorflow/python/debug/wrappers:local_cli_wrapper_test PASSED in 9.4s //tensorflow/python/distribute:checkpoint_utils_test_2gpu PASSED in 15.6s //tensorflow/python/distribute:checkpoint_utils_test_cpu PASSED in 15.4s //tensorflow/python/distribute:checkpointing_test_2gpu PASSED in 20.7s //tensorflow/python/distribute:checkpointing_test_cpu PASSED in 19.6s //tensorflow/python/distribute:collective_util_test PASSED in 12.6s //tensorflow/python/distribute:combinations_test_2gpu PASSED in 27.8s //tensorflow/python/distribute:combinations_test_cpu PASSED in 27.3s //tensorflow/python/distribute:cross_device_utils_test_cpu PASSED in 15.5s //tensorflow/python/distribute:custom_training_loop_gradient_test_2gpu PASSED in 14.9s //tensorflow/python/distribute:custom_training_loop_gradient_test_cpu PASSED in 14.9s //tensorflow/python/distribute:device_util_test_cpu PASSED in 15.1s //tensorflow/python/distribute:distribute_coordinator_test PASSED in 16.7s //tensorflow/python/distribute:distribute_lib_test PASSED in 12.4s //tensorflow/python/distribute:distribute_utils_test_2gpu PASSED in 14.7s //tensorflow/python/distribute:distribute_utils_test_cpu PASSED in 15.1s //tensorflow/python/distribute:input_ops_test_cpu PASSED in 22.7s //tensorflow/python/distribute:metrics_v1_test_2gpu PASSED in 48.9s //tensorflow/python/distribute:metrics_v1_test_cpu PASSED in 54.2s //tensorflow/python/distribute:mirrored_values_test_2gpu PASSED in 18.8s //tensorflow/python/distribute:mirrored_values_test_cpu PASSED in 17.9s //tensorflow/python/distribute:mirrored_variable_test_2gpu PASSED in 30.0s //tensorflow/python/distribute:mirrored_variable_test_cpu PASSED in 38.1s 
//tensorflow/python/distribute:multi_process_runner_no_init_test PASSED in 10.2s //tensorflow/python/distribute:multi_worker_continuous_run_test_cpu PASSED in 33.9s //tensorflow/python/distribute:multi_worker_util_test PASSED in 8.5s //tensorflow/python/distribute:mwms_pjrt_gpu_test_2gpu PASSED in 12.9s //tensorflow/python/distribute:mwms_pjrt_gpu_test_cpu PASSED in 13.0s //tensorflow/python/distribute:numpy_dataset_test PASSED in 9.3s //tensorflow/python/distribute:one_device_strategy_test_cpu PASSED in 27.4s //tensorflow/python/distribute:packed_distributed_variable_test PASSED in 11.3s //tensorflow/python/distribute:parameter_server_strategy_test_2gpu PASSED in 32.6s //tensorflow/python/distribute:parameter_server_strategy_test_cpu PASSED in 32.4s //tensorflow/python/distribute:parameter_server_strategy_v2_test_2gpu PASSED in 38.5s //tensorflow/python/distribute:parameter_server_strategy_v2_test_cpu PASSED in 40.6s //tensorflow/python/distribute:per_replica_test_2gpu PASSED in 26.5s //tensorflow/python/distribute:per_replica_test_cpu PASSED in 16.9s //tensorflow/python/distribute:ps_values_test_2gpu PASSED in 24.1s //tensorflow/python/distribute:ps_values_test_cpu PASSED in 21.8s //tensorflow/python/distribute:remote_mirrored_strategy_eager_test_cpu PASSED in 25.2s //tensorflow/python/distribute:sharded_variable_test PASSED in 36.5s //tensorflow/python/distribute:shared_variable_creator_test PASSED in 8.4s //tensorflow/python/distribute:strategy_combinations_test_cpu PASSED in 52.4s //tensorflow/python/distribute:template_mirrored_strategy_test_cpu PASSED in 13.3s //tensorflow/python/distribute:test_util_test_2gpu PASSED in 21.5s //tensorflow/python/distribute:test_util_test_cpu PASSED in 23.9s //tensorflow/python/distribute:tf_function_test_2gpu PASSED in 15.4s //tensorflow/python/distribute:tf_function_test_cpu PASSED in 15.2s //tensorflow/python/distribute:values_v2_test_cpu PASSED in 14.5s //tensorflow/python/distribute:warm_starting_util_test_2gpu PASSED in 15.2s //tensorflow/python/distribute:warm_starting_util_test_cpu PASSED in 15.0s //tensorflow/python/distribute/cluster_resolver:base_cluster_resolver_py_test PASSED in 10.8s //tensorflow/python/distribute/cluster_resolver:gce_cluster_resolver_py_test PASSED in 9.6s //tensorflow/python/distribute/cluster_resolver:kubernetes_cluster_resolver_py_test PASSED in 10.1s //tensorflow/python/distribute/cluster_resolver:sagemaker_cluster_resolver_py_test PASSED in 9.9s //tensorflow/python/distribute/cluster_resolver:slurm_cluster_resolver_py_test PASSED in 9.8s //tensorflow/python/distribute/cluster_resolver:tfconfig_cluster_resolver_py_test PASSED in 11.8s //tensorflow/python/distribute/cluster_resolver/tpu:tpu_cluster_resolver_py_test PASSED in 15.3s //tensorflow/python/distribute/coordinator:watchdog_test PASSED in 63.8s //tensorflow/python/distribute/experimental:dtensor_util_test_cpu PASSED in 13.2s //tensorflow/python/distribute/experimental:mirrored_strategy_test_cpu PASSED in 32.0s //tensorflow/python/distribute/experimental:multi_worker_mirrored_strategy_test_cpu PASSED in 19.7s //tensorflow/python/distribute/integration_test:saved_model_test_cpu PASSED in 55.3s //tensorflow/python/distribute/parallel_device:parallel_device_test_cpu PASSED in 16.9s //tensorflow/python/distribute/v1:all_reduce_test PASSED in 58.8s //tensorflow/python/distribute/v1:cross_device_ops_test_cpu PASSED in 93.7s //tensorflow/python/dlpack:dlpack_test_cpu PASSED in 9.9s //tensorflow/python/eager:backprop_test_cpu PASSED in 170.0s 
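The //tensorflow/python/distribute block above runs each strategy test in cpu and 2gpu variants. A minimal sketch of the strategy API under test; the replica count depends on the host's devices:

    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy()   # one replica per local device
    with strategy.scope():
        v = tf.Variable(1.0)                      # mirrored across replicas
    print("replicas:", strategy.num_replicas_in_sync)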
//tensorflow/python/eager:cancellation_test_cpu PASSED in 10.5s //tensorflow/python/eager:context_test_cpu PASSED in 14.1s //tensorflow/python/eager:core_test_cpu PASSED in 28.8s //tensorflow/python/eager:gradient_input_output_exclusions_test PASSED in 46.1s //tensorflow/python/eager:graph_only_ops_test_cpu PASSED in 11.9s //tensorflow/python/eager:lift_to_graph_test PASSED in 12.6s //tensorflow/python/eager:monitoring_test_cpu PASSED in 12.9s //tensorflow/python/eager:ops_test_cpu PASSED in 10.3s //tensorflow/python/eager:profiler_client_test PASSED in 8.7s //tensorflow/python/eager:profiler_test_cpu PASSED in 11.6s //tensorflow/python/eager:pywrap_tfe_test PASSED in 29.8s //tensorflow/python/eager:record_test PASSED in 11.2s //tensorflow/python/eager:run_eager_op_as_function_test_cpu PASSED in 11.9s //tensorflow/python/eager:run_eager_op_as_function_xla_test_cpu PASSED in 9.6s //tensorflow/python/eager:tensor_test_cpu PASSED in 18.0s //tensorflow/python/eager:wrap_function_device_test_cpu PASSED in 13.0s //tensorflow/python/eager:wrap_function_test PASSED in 13.0s //tensorflow/python/eager/memory_tests:remote_memory_test_cpu PASSED in 10.8s //tensorflow/python/eager/polymorphic_function:argument_naming_test_cpu PASSED in 17.1s //tensorflow/python/eager/polymorphic_function:atomic_function_test_cpu PASSED in 13.2s //tensorflow/python/eager/polymorphic_function:collection_test_cpu PASSED in 13.5s //tensorflow/python/eager/polymorphic_function:compiler_ir_test_cpu PASSED in 66.1s //tensorflow/python/eager/polymorphic_function:compiler_ir_test_cpu_mlir_bridge_test PASSED in 75.4s //tensorflow/python/eager/polymorphic_function:concrete_function_test_cpu PASSED in 17.9s //tensorflow/python/eager/polymorphic_function:function_spec_test PASSED in 8.7s //tensorflow/python/eager/polymorphic_function:polymorphic_function_xla_test_cpu PASSED in 40.0s //tensorflow/python/eager/polymorphic_function:tracing_compilation_test PASSED in 21.5s //tensorflow/python/feature_column:sequence_feature_column_integration_test PASSED in 13.7s //tensorflow/python/feature_column:serialization_test PASSED in 20.0s //tensorflow/python/framework:auto_control_deps_test PASSED in 38.9s //tensorflow/python/framework:c_api_util_test PASSED in 13.0s //tensorflow/python/framework:common_shapes_test PASSED in 10.9s //tensorflow/python/framework:composite_tensor_test PASSED in 9.7s //tensorflow/python/framework:config_test_2gpu PASSED in 14.6s //tensorflow/python/framework:config_test_cpu PASSED in 16.2s //tensorflow/python/framework:constant_op_test PASSED in 12.7s //tensorflow/python/framework:device_spec_test PASSED in 10.1s //tensorflow/python/framework:device_test PASSED in 10.2s //tensorflow/python/framework:dtypes_test PASSED in 21.6s //tensorflow/python/framework:error_interpolation_test PASSED in 10.5s //tensorflow/python/framework:errors_test PASSED in 12.3s //tensorflow/python/framework:extension_type_field_test PASSED in 10.2s //tensorflow/python/framework:extension_type_test PASSED in 40.1s //tensorflow/python/framework:file_system_test PASSED in 11.6s //tensorflow/python/framework:flexible_dtypes_test PASSED in 100.7s //tensorflow/python/framework:function_def_to_graph_test PASSED in 16.4s //tensorflow/python/framework:graph_util_test PASSED in 13.3s //tensorflow/python/framework:immutable_dict_test PASSED in 10.2s //tensorflow/python/framework:importer_test PASSED in 14.0s //tensorflow/python/framework:indexed_slices_test PASSED in 11.9s //tensorflow/python/framework:kernels_test PASSED in 10.2s 
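The eager-mode results above (backprop, record, wrap_function) center on gradient recording. A minimal GradientTape example:

    import tensorflow as tf

    x = tf.Variable(3.0)
    with tf.GradientTape() as tape:
        y = x * x
    print(tape.gradient(y, x).numpy())   # dy/dx = 2x = 6.0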
//tensorflow/python/framework:meta_graph_test PASSED in 16.7s
//tensorflow/python/framework:node_file_writer_test_cpu PASSED in 11.5s
//tensorflow/python/framework:offset_counter_helper_test PASSED in 1.1s
//tensorflow/python/framework:op_allowlist_namespace_test PASSED in 3.6s
//tensorflow/python/framework:op_callbacks_test_cpu PASSED in 16.5s
//tensorflow/python/framework:op_def_library_test PASSED in 11.7s
//tensorflow/python/framework:op_def_util_test PASSED in 10.8s
//tensorflow/python/framework:ops_enable_eager_test PASSED in 3.9s
//tensorflow/python/framework:ops_test PASSED in 26.4s
//tensorflow/python/framework:proto_test PASSED in 9.5s
//tensorflow/python/framework:py_context_manager_test PASSED in 12.2s
//tensorflow/python/framework:python_api_dispatcher_test PASSED in 12.9s
//tensorflow/python/framework:python_api_info_test PASSED in 14.3s
//tensorflow/python/framework:python_api_parameter_converter_test PASSED in 13.9s
//tensorflow/python/framework:python_op_gen_annotation_test PASSED in 7.5s
//tensorflow/python/framework:python_op_gen_annotator_test PASSED in 0.1s
//tensorflow/python/framework:python_op_gen_test PASSED in 0.1s
//tensorflow/python/framework:python_tensor_converter_test PASSED in 12.8s
//tensorflow/python/framework:random_seed_test PASSED in 12.0s
//tensorflow/python/framework:registry_test PASSED in 11.2s
//tensorflow/python/framework:smart_cond_test PASSED in 13.3s
//tensorflow/python/framework:sparse_tensor_test PASSED in 11.9s
//tensorflow/python/framework:subscribe_test PASSED in 13.7s
//tensorflow/python/framework:tensor_shape_test PASSED in 11.7s
//tensorflow/python/framework:tensor_test PASSED in 10.7s
//tensorflow/python/framework:tensor_util_test PASSED in 14.1s
//tensorflow/python/framework:test_combinations_test PASSED in 9.1s
//tensorflow/python/framework:test_util_test_cpu PASSED in 21.4s
//tensorflow/python/framework:tf2_test PASSED in 9.5s
//tensorflow/python/framework:traceable_stack_test PASSED in 12.3s
//tensorflow/python/framework:type_spec_test PASSED in 12.7s
//tensorflow/python/framework:versions_test PASSED in 11.1s
//tensorflow/python/framework:weak_tensor_test PASSED in 16.8s
//tensorflow/python/framework/experimental:unified_api_test_cpu PASSED in 24.4s
//tensorflow/python/grappler:arithmetic_optimizer_test_cpu PASSED in 13.3s
//tensorflow/python/grappler:auto_mixed_precision_test_cpu PASSED in 16.7s
//tensorflow/python/grappler:constant_folding_test_cpu PASSED in 16.3s
//tensorflow/python/grappler:cost_analyzer_test PASSED in 18.6s
//tensorflow/python/grappler:datasets_test PASSED in 16.1s
//tensorflow/python/grappler:item_test PASSED in 12.6s
//tensorflow/python/grappler:memory_optimizer_test PASSED in 23.8s
//tensorflow/python/grappler:model_analyzer_test PASSED in 10.0s
//tensorflow/python/grappler:remapper_test_cpu PASSED in 14.4s
//tensorflow/python/grappler:tf_optimizer_test PASSED in 10.9s
//tensorflow/python/kernel_tests:benchmark_test_cpu PASSED in 13.1s
//tensorflow/python/kernel_tests:check_ops_test_cpu PASSED in 30.6s
//tensorflow/python/kernel_tests:collective_ops_multi_worker_test PASSED in 34.6s
//tensorflow/python/kernel_tests:composite_tensor_ops_test PASSED in 13.5s
//tensorflow/python/kernel_tests:critical_section_test_cpu PASSED in 28.9s
//tensorflow/python/kernel_tests:garbage_collection_test PASSED in 11.6s
//tensorflow/python/kernel_tests:gradient_correctness_test_cpu PASSED in 10.0s
//tensorflow/python/kernel_tests:histogram_ops_test_cpu PASSED in 13.0s
//tensorflow/python/kernel_tests:logging_ops_test_cpu PASSED in 12.0s
//tensorflow/python/kernel_tests:numerics_test_cpu PASSED in 10.7s
//tensorflow/python/kernel_tests:template_test PASSED in 16.3s
//tensorflow/python/kernel_tests:trace_op_test_cpu PASSED in 10.1s
//tensorflow/python/kernel_tests/array_ops:batch_gather_op_test_cpu PASSED in 12.5s
//tensorflow/python/kernel_tests/array_ops:batch_scatter_ops_test PASSED in 11.1s
//tensorflow/python/kernel_tests/array_ops:batchtospace_op_test_cpu PASSED in 16.8s
//tensorflow/python/kernel_tests/array_ops:bcast_ops_test PASSED in 9.6s
//tensorflow/python/kernel_tests/array_ops:bitcast_op_test_cpu PASSED in 12.5s
//tensorflow/python/kernel_tests/array_ops:broadcast_to_ops_test_cpu PASSED in 37.9s
//tensorflow/python/kernel_tests/array_ops:cast_op_test_cpu PASSED in 13.5s
//tensorflow/python/kernel_tests/array_ops:constant_op_eager_test_cpu PASSED in 11.1s
//tensorflow/python/kernel_tests/array_ops:constant_op_test_cpu PASSED in 16.4s
//tensorflow/python/kernel_tests/array_ops:denormal_test_cpu PASSED in 12.9s
//tensorflow/python/kernel_tests/array_ops:depthtospace_op_test_cpu PASSED in 12.4s
//tensorflow/python/kernel_tests/array_ops:edit_distance_op_test PASSED in 9.5s
//tensorflow/python/kernel_tests/array_ops:fingerprint_op_test PASSED in 9.3s
//tensorflow/python/kernel_tests/array_ops:gather_nd_op_test_cpu PASSED in 9.6s
//tensorflow/python/kernel_tests/array_ops:identity_n_op_py_test PASSED in 10.2s
//tensorflow/python/kernel_tests/array_ops:identity_op_py_test PASSED in 51.3s
//tensorflow/python/kernel_tests/array_ops:large_concat_op_test_cpu PASSED in 14.8s
//tensorflow/python/kernel_tests/array_ops:manip_ops_test_cpu PASSED in 27.7s
//tensorflow/python/kernel_tests/array_ops:one_hot_op_test_cpu PASSED in 13.3s
//tensorflow/python/kernel_tests/array_ops:pad_op_test_cpu PASSED in 19.3s
//tensorflow/python/kernel_tests/array_ops:reshape_op_test_cpu PASSED in 11.0s
//tensorflow/python/kernel_tests/array_ops:reverse_sequence_op_test_cpu PASSED in 11.2s
//tensorflow/python/kernel_tests/array_ops:scalar_test_cpu PASSED in 11.8s
//tensorflow/python/kernel_tests/array_ops:shape_ops_test_cpu PASSED in 16.7s
//tensorflow/python/kernel_tests/array_ops:slice_op_test_cpu PASSED in 12.5s
//tensorflow/python/kernel_tests/array_ops:spacetobatch_op_test_cpu PASSED in 23.4s
//tensorflow/python/kernel_tests/array_ops:spacetodepth_op_test_cpu PASSED in 13.7s
//tensorflow/python/kernel_tests/array_ops:stack_op_test_cpu PASSED in 20.5s
//tensorflow/python/kernel_tests/array_ops:unique_op_test_cpu PASSED in 11.6s
//tensorflow/python/kernel_tests/array_ops:unstack_op_test_cpu PASSED in 11.9s
//tensorflow/python/kernel_tests/array_ops:where_op_test_cpu PASSED in 21.8s
//tensorflow/python/kernel_tests/control_flow:cond_v2_test_cpu PASSED in 78.8s
//tensorflow/python/kernel_tests/control_flow:control_flow_util_test PASSED in 10.5s
//tensorflow/python/kernel_tests/control_flow:control_flow_util_v2_test PASSED in 11.6s
//tensorflow/python/kernel_tests/control_flow:py_func_test_cpu PASSED in 25.5s
//tensorflow/python/kernel_tests/control_flow:scan_ops_test_cpu PASSED in 106.0s
//tensorflow/python/kernel_tests/control_flow:while_v2_test_cpu PASSED in 92.3s
//tensorflow/python/kernel_tests/custom_ops:ackermann_test PASSED in 13.6s
//tensorflow/python/kernel_tests/custom_ops:duplicate_op_test PASSED in 12.1s
//tensorflow/python/kernel_tests/custom_ops:invalid_op_test PASSED in 12.0s
//tensorflow/python/kernel_tests/data_structures:conditional_accumulator_test PASSED in 13.5s
//tensorflow/python/kernel_tests/data_structures:dynamic_partition_op_test_2gpu PASSED in 16.7s
//tensorflow/python/kernel_tests/data_structures:dynamic_partition_op_test_cpu PASSED in 16.3s
//tensorflow/python/kernel_tests/data_structures:dynamic_stitch_op_test_cpu PASSED in 10.6s
//tensorflow/python/kernel_tests/data_structures:fifo_queue_test PASSED in 11.0s
//tensorflow/python/kernel_tests/data_structures:list_ops_test_cpu PASSED in 34.2s
//tensorflow/python/kernel_tests/data_structures:listdiff_op_test PASSED in 10.9s
//tensorflow/python/kernel_tests/data_structures:lookup_ops_test PASSED in 38.3s
//tensorflow/python/kernel_tests/data_structures:map_ops_test PASSED in 19.7s
//tensorflow/python/kernel_tests/data_structures:padding_fifo_queue_test_cpu PASSED in 10.2s
//tensorflow/python/kernel_tests/data_structures:priority_queue_test PASSED in 9.5s
//tensorflow/python/kernel_tests/data_structures:stack_ops_test_cpu PASSED in 10.3s
//tensorflow/python/kernel_tests/data_structures:stage_op_test_cpu PASSED in 13.0s
//tensorflow/python/kernel_tests/distributions:bernoulli_test_cpu PASSED in 21.4s
//tensorflow/python/kernel_tests/distributions:bijector_test_cpu PASSED in 15.4s
//tensorflow/python/kernel_tests/distributions:categorical_test_cpu PASSED in 15.9s
//tensorflow/python/kernel_tests/distributions:dirichlet_multinomial_test_cpu PASSED in 17.0s
//tensorflow/python/kernel_tests/distributions:dirichlet_test_cpu PASSED in 19.0s
//tensorflow/python/kernel_tests/distributions:exponential_test_cpu PASSED in 25.8s
//tensorflow/python/kernel_tests/distributions:gamma_test_cpu PASSED in 62.5s
//tensorflow/python/kernel_tests/distributions:identity_bijector_test_cpu PASSED in 13.2s
//tensorflow/python/kernel_tests/distributions:kullback_leibler_test_cpu PASSED in 12.6s
//tensorflow/python/kernel_tests/distributions:laplace_test_cpu PASSED in 46.8s
//tensorflow/python/kernel_tests/distributions:multinomial_test_cpu PASSED in 15.1s
//tensorflow/python/kernel_tests/distributions:normal_test_cpu PASSED in 37.3s
//tensorflow/python/kernel_tests/distributions:special_math_test_cpu PASSED in 30.7s
//tensorflow/python/kernel_tests/distributions:uniform_test_cpu PASSED in 21.3s
//tensorflow/python/kernel_tests/image_ops:attention_ops_test PASSED in 13.2s
//tensorflow/python/kernel_tests/image_ops:decode_bmp_op_test PASSED in 11.2s
//tensorflow/python/kernel_tests/image_ops:decode_compressed_op_test PASSED in 10.7s
//tensorflow/python/kernel_tests/image_ops:decode_image_op_test PASSED in 11.0s
//tensorflow/python/kernel_tests/image_ops:decode_png_op_test PASSED in 11.1s
//tensorflow/python/kernel_tests/image_ops:decode_raw_op_test PASSED in 12.1s
//tensorflow/python/kernel_tests/image_ops:draw_bounding_box_op_test_cpu PASSED in 12.8s
//tensorflow/python/kernel_tests/image_ops:extract_image_patches_op_test_cpu PASSED in 9.1s
//tensorflow/python/kernel_tests/image_ops:extract_volume_patches_op_test_cpu PASSED in 10.6s
//tensorflow/python/kernel_tests/io_ops:checkpoint_ops_test PASSED in 13.6s
//tensorflow/python/kernel_tests/io_ops:decode_csv_op_test PASSED in 9.7s
//tensorflow/python/kernel_tests/io_ops:io_ops_test PASSED in 10.9s
//tensorflow/python/kernel_tests/io_ops:parse_single_example_op_test PASSED in 13.0s
//tensorflow/python/kernel_tests/io_ops:parsing_ops_test PASSED in 42.8s
//tensorflow/python/kernel_tests/io_ops:reader_ops_test PASSED in 14.5s
//tensorflow/python/kernel_tests/io_ops:record_input_test PASSED in 57.8s
//tensorflow/python/kernel_tests/io_ops:save_restore_ops_test PASSED in 10.0s
//tensorflow/python/kernel_tests/linalg:determinant_op_test_cpu PASSED in 9.9s
//tensorflow/python/kernel_tests/linalg:linear_operator_addition_test_cpu PASSED in 12.9s
//tensorflow/python/kernel_tests/linalg:linear_operator_test_cpu PASSED in 18.7s
//tensorflow/python/kernel_tests/linalg:lu_op_test_cpu PASSED in 13.4s
//tensorflow/python/kernel_tests/linalg:matrix_inverse_op_test_cpu PASSED in 12.8s
//tensorflow/python/kernel_tests/linalg:matrix_logarithm_op_test PASSED in 78.1s
//tensorflow/python/kernel_tests/linalg:matrix_solve_ls_op_test_cpu PASSED in 63.7s
//tensorflow/python/kernel_tests/linalg:matrix_solve_op_test_cpu PASSED in 29.4s
//tensorflow/python/kernel_tests/linalg:matrix_square_root_op_test_cpu PASSED in 12.3s
//tensorflow/python/kernel_tests/linalg:slicing_test_cpu PASSED in 17.4s
//tensorflow/python/kernel_tests/linalg/sparse:conjugate_gradient_test_cpu PASSED in 16.5s
//tensorflow/python/kernel_tests/linalg/sparse:csr_sparse_matrix_test_cpu PASSED in 11.3s
//tensorflow/python/kernel_tests/math_ops:aggregate_ops_test_cpu PASSED in 14.2s
//tensorflow/python/kernel_tests/math_ops:argmax_op_test_cpu PASSED in 12.0s
//tensorflow/python/kernel_tests/math_ops:banded_triangular_solve_op_test_cpu PASSED in 17.5s
//tensorflow/python/kernel_tests/math_ops:basic_gpu_test_cpu PASSED in 10.9s
//tensorflow/python/kernel_tests/math_ops:bincount_op_test_cpu PASSED in 12.4s
//tensorflow/python/kernel_tests/math_ops:bucketize_op_test_cpu PASSED in 9.7s
//tensorflow/python/kernel_tests/math_ops:clip_ops_test PASSED in 11.5s
//tensorflow/python/kernel_tests/math_ops:confusion_matrix_test PASSED in 17.5s
//tensorflow/python/kernel_tests/math_ops:cross_grad_test_cpu PASSED in 11.0s
//tensorflow/python/kernel_tests/math_ops:cumulative_logsumexp_test_cpu PASSED in 14.5s
//tensorflow/python/kernel_tests/math_ops:in_topk_op_test_cpu PASSED in 11.4s
//tensorflow/python/kernel_tests/math_ops:segment_reduction_ops_d9m_test_cpu PASSED in 11.6s
//tensorflow/python/kernel_tests/math_ops:sets_test PASSED in 37.9s
//tensorflow/python/kernel_tests/math_ops:topk_op_test_cpu PASSED in 13.1s
//tensorflow/python/kernel_tests/math_ops:zero_division_test_cpu PASSED in 11.6s
//tensorflow/python/kernel_tests/nn_ops:betainc_op_test_cpu PASSED in 15.5s
//tensorflow/python/kernel_tests/nn_ops:bias_op_test_cpu PASSED in 208.7s
//tensorflow/python/kernel_tests/nn_ops:conv1d_test_cpu PASSED in 10.7s
//tensorflow/python/kernel_tests/nn_ops:conv1d_transpose_test_cpu PASSED in 11.7s
//tensorflow/python/kernel_tests/nn_ops:conv2d_transpose_test_cpu PASSED in 12.0s
//tensorflow/python/kernel_tests/nn_ops:conv3d_backprop_filter_v2_grad_test_cpu PASSED in 28.7s
//tensorflow/python/kernel_tests/nn_ops:conv3d_transpose_test_cpu PASSED in 15.3s
//tensorflow/python/kernel_tests/nn_ops:ctc_decoder_ops_test PASSED in 10.1s
//tensorflow/python/kernel_tests/nn_ops:ctc_loss_op_test_cpu PASSED in 119.9s
//tensorflow/python/kernel_tests/nn_ops:cudnn_d9m_test_cpu PASSED in 9.6s
//tensorflow/python/kernel_tests/nn_ops:cudnn_deterministic_ops_test_cpu PASSED in 12.6s
//tensorflow/python/kernel_tests/nn_ops:losses_test PASSED in 52.6s
//tensorflow/python/kernel_tests/nn_ops:lrn_op_test_cpu PASSED in 15.0s
//tensorflow/python/kernel_tests/nn_ops:morphological_ops_test_cpu PASSED in 18.1s
//tensorflow/python/kernel_tests/nn_ops:nth_element_op_test_cpu PASSED in 12.4s
//tensorflow/python/kernel_tests/nn_ops:pool_test_cpu PASSED in 58.7s
//tensorflow/python/kernel_tests/nn_ops:pooling_ops_3d_test_cpu PASSED in 41.6s
//tensorflow/python/kernel_tests/nn_ops:relu_op_test_cpu PASSED in 13.9s
//tensorflow/python/kernel_tests/nn_ops:softmax_op_test_cpu PASSED in 11.8s
//tensorflow/python/kernel_tests/nn_ops:softplus_op_test_cpu PASSED in 11.4s
//tensorflow/python/kernel_tests/nn_ops:softsign_op_test_cpu PASSED in 10.2s
//tensorflow/python/kernel_tests/nn_ops:xent_op_d9m_test_cpu PASSED in 183.0s
//tensorflow/python/kernel_tests/nn_ops:xent_op_test_cpu PASSED in 13.8s
//tensorflow/python/kernel_tests/proto:decode_proto_op_test PASSED in 12.8s
//tensorflow/python/kernel_tests/proto:descriptor_source_test PASSED in 11.2s
//tensorflow/python/kernel_tests/proto:encode_proto_op_test PASSED in 13.4s
//tensorflow/python/kernel_tests/quantization_ops:quantization_ops_test PASSED in 11.7s
//tensorflow/python/kernel_tests/random:candidate_sampler_ops_test PASSED in 13.6s
//tensorflow/python/kernel_tests/random:multinomial_op_test_cpu PASSED in 11.3s
//tensorflow/python/kernel_tests/random:parameterized_truncated_normal_op_test_cpu PASSED in 19.8s
//tensorflow/python/kernel_tests/random:random_crop_test_cpu PASSED in 15.9s
//tensorflow/python/kernel_tests/random:random_grad_test_cpu PASSED in 19.0s
//tensorflow/python/kernel_tests/random:random_ops_test_cpu PASSED in 27.0s
//tensorflow/python/kernel_tests/random:random_poisson_test_cpu PASSED in 19.7s
//tensorflow/python/kernel_tests/random:random_shuffle_queue_test PASSED in 11.9s
//tensorflow/python/kernel_tests/random:stateful_random_ops_test_cpu PASSED in 29.2s
//tensorflow/python/kernel_tests/signal:mel_ops_test_cpu PASSED in 21.9s
//tensorflow/python/kernel_tests/signal:mfcc_ops_test_cpu PASSED in 13.0s
//tensorflow/python/kernel_tests/signal:reconstruction_ops_test_cpu PASSED in 18.8s
//tensorflow/python/kernel_tests/signal:shape_ops_test_cpu PASSED in 35.0s
//tensorflow/python/kernel_tests/sparse_ops:sparse_add_op_test PASSED in 16.4s
//tensorflow/python/kernel_tests/sparse_ops:sparse_concat_op_test PASSED in 11.6s
//tensorflow/python/kernel_tests/sparse_ops:sparse_conditional_accumulator_test PASSED in 13.4s
//tensorflow/python/kernel_tests/sparse_ops:sparse_cross_op_test PASSED in 22.6s
//tensorflow/python/kernel_tests/sparse_ops:sparse_matmul_op_test_cpu PASSED in 64.3s
//tensorflow/python/kernel_tests/sparse_ops:sparse_reorder_op_test PASSED in 13.6s
//tensorflow/python/kernel_tests/sparse_ops:sparse_reshape_op_test PASSED in 13.0s
//tensorflow/python/kernel_tests/sparse_ops:sparse_serialization_ops_test PASSED in 14.8s
//tensorflow/python/kernel_tests/sparse_ops:sparse_slice_op_test PASSED in 14.3s
//tensorflow/python/kernel_tests/sparse_ops:sparse_split_op_test_cpu PASSED in 13.9s
//tensorflow/python/kernel_tests/sparse_ops:sparse_tensor_dense_matmul_grad_test_cpu PASSED in 32.7s
//tensorflow/python/kernel_tests/sparse_ops:sparse_tensor_dense_matmul_op_d9m_test_cpu PASSED in 57.3s
//tensorflow/python/kernel_tests/sparse_ops:sparse_tensor_dense_matmul_op_test_cpu PASSED in 47.2s
//tensorflow/python/kernel_tests/sparse_ops:sparse_tensors_map_ops_test PASSED in 15.0s
//tensorflow/python/kernel_tests/sparse_ops:sparse_to_dense_op_py_test_cpu PASSED in 13.2s
//tensorflow/python/kernel_tests/sparse_ops:sparse_xent_op_d9m_test_cpu PASSED in 131.6s
//tensorflow/python/kernel_tests/sparse_ops:sparse_xent_op_test_cpu PASSED in 16.6s
//tensorflow/python/kernel_tests/sparse_ops:sparsemask_op_test PASSED in 12.2s
//tensorflow/python/kernel_tests/strings_ops:as_string_op_test PASSED in 11.2s
//tensorflow/python/kernel_tests/strings_ops:base64_ops_test PASSED in 20.0s
//tensorflow/python/kernel_tests/strings_ops:reduce_join_op_test_cpu PASSED in 13.9s
//tensorflow/python/kernel_tests/strings_ops:regex_full_match_op_test PASSED in 15.2s
//tensorflow/python/kernel_tests/strings_ops:regex_replace_op_test PASSED in 13.5s
//tensorflow/python/kernel_tests/strings_ops:string_bytes_split_op_test PASSED in 14.3s
//tensorflow/python/kernel_tests/strings_ops:string_format_op_test PASSED in 16.8s
//tensorflow/python/kernel_tests/strings_ops:string_join_op_test PASSED in 10.3s
//tensorflow/python/kernel_tests/strings_ops:string_length_op_test PASSED in 11.1s
//tensorflow/python/kernel_tests/strings_ops:string_lower_op_test PASSED in 12.0s
//tensorflow/python/kernel_tests/strings_ops:string_split_op_test PASSED in 16.7s
//tensorflow/python/kernel_tests/strings_ops:string_strip_op_test PASSED in 14.1s
//tensorflow/python/kernel_tests/strings_ops:string_to_hash_bucket_op_test_cpu PASSED in 16.2s
//tensorflow/python/kernel_tests/strings_ops:string_to_number_op_test_cpu PASSED in 14.7s
//tensorflow/python/kernel_tests/strings_ops:string_upper_op_test PASSED in 13.8s
//tensorflow/python/kernel_tests/strings_ops:substr_op_test PASSED in 15.7s
//tensorflow/python/kernel_tests/strings_ops:unicode_decode_op_test PASSED in 23.9s
//tensorflow/python/kernel_tests/strings_ops:unicode_encode_op_test PASSED in 15.8s
//tensorflow/python/kernel_tests/strings_ops:unicode_script_op_test PASSED in 15.5s
//tensorflow/python/kernel_tests/strings_ops:unicode_transcode_op_test PASSED in 14.0s
//tensorflow/python/kernel_tests/strings_ops:unsorted_segment_join_op_test_cpu PASSED in 14.2s
//tensorflow/python/kernel_tests/summary_ops:summary_ops_test_cpu PASSED in 23.7s
//tensorflow/python/kernel_tests/summary_ops:summary_v1_audio_op_test_cpu PASSED in 14.7s
//tensorflow/python/kernel_tests/summary_ops:summary_v1_image_op_test_cpu PASSED in 13.6s
//tensorflow/python/kernel_tests/summary_ops:summary_v1_ops_test PASSED in 13.2s
//tensorflow/python/kernel_tests/summary_ops:summary_v1_tensor_op_test PASSED in 14.1s
//tensorflow/python/kernel_tests/v1_compat_tests:array_ops_test_cpu PASSED in 13.0s
//tensorflow/python/kernel_tests/v1_compat_tests:dense_update_ops_test_cpu PASSED in 10.7s
//tensorflow/python/kernel_tests/v1_compat_tests:identity_op_py_test PASSED in 10.8s
//tensorflow/python/kernel_tests/v1_compat_tests:scatter_nd_ops_test_cpu PASSED in 10.3s
//tensorflow/python/kernel_tests/v1_compat_tests:session_ops_test_cpu PASSED in 11.9s
//tensorflow/python/kernel_tests/v1_compat_tests:stack_op_test_cpu PASSED in 10.4s
//tensorflow/python/kernel_tests/variables:dense_update_ops_no_tsan_test_cpu PASSED in 11.6s
//tensorflow/python/kernel_tests/variables:dense_update_ops_test_cpu PASSED in 10.9s
//tensorflow/python/kernel_tests/variables:partitioned_variables_test PASSED in 15.0s
//tensorflow/python/kernel_tests/variables:resource_variable_ops_test_cpu PASSED in 65.2s
//tensorflow/python/kernel_tests/variables:variable_ops_test_cpu PASSED in 11.0s
//tensorflow/python/kernel_tests/variables:variable_scope_test PASSED in 34.2s
//tensorflow/python/kernel_tests/variables:variables_test PASSED in 17.6s
//tensorflow/python/lib/io:file_io_test PASSED in 17.1s
//tensorflow/python/lib/io:tf_record_test PASSED in 14.0s
//tensorflow/python/module:module_test PASSED in 15.2s
//tensorflow/python/ops:array_grad_test_cpu PASSED in 14.7s
//tensorflow/python/ops:array_ops_shape_test PASSED in 8.9s
//tensorflow/python/ops:array_ops_test PASSED in 9.4s
//tensorflow/python/ops:autograph_ops_test PASSED in 9.2s
//tensorflow/python/ops:bincount_ops_test_cpu PASSED in 14.3s
//tensorflow/python/ops:bitwise_ops_test_cpu PASSED in 13.2s
//tensorflow/python/ops:clip_ops_test PASSED in 12.4s
//tensorflow/python/ops:clustering_ops_test PASSED in 34.9s
//tensorflow/python/ops:collective_ops_gpu_test_cpu PASSED in 11.8s
//tensorflow/python/ops:collective_ops_test PASSED in 25.4s
//tensorflow/python/ops:collective_ops_xla_test PASSED in 11.9s
//tensorflow/python/ops:compiled_collective_ops_gpu_test_2gpu PASSED in 12.1s
//tensorflow/python/ops:compiled_collective_ops_gpu_test_cpu PASSED in 12.3s
//tensorflow/python/ops:control_flow_v2_enable_test PASSED in 10.3s
//tensorflow/python/ops:control_flow_v2_toggles_test PASSED in 11.7s
//tensorflow/python/ops:dequantize_op_test PASSED in 12.1s
//tensorflow/python/ops:embedding_ops_test_cpu PASSED in 15.8s
//tensorflow/python/ops:factory_ops_test_cpu PASSED in 21.3s
//tensorflow/python/ops:functional_ops_test PASSED in 8.7s
//tensorflow/python/ops:gradient_checker_v2_test_cpu PASSED in 33.8s
//tensorflow/python/ops:gradients_test_cpu PASSED in 25.3s
//tensorflow/python/ops:init_ops_test_cpu PASSED in 11.1s
//tensorflow/python/ops:init_ops_v2_test_cpu PASSED in 14.4s
//tensorflow/python/ops:lookup_ops_async_checkpoint_test PASSED in 12.8s
//tensorflow/python/ops:math_grad_test_cpu PASSED in 24.0s
//tensorflow/python/ops:math_ops_linspace_test_cpu PASSED in 11.8s
//tensorflow/python/ops:math_ops_test_cpu PASSED in 29.4s
//tensorflow/python/ops:nn_grad_test_cpu PASSED in 13.6s
//tensorflow/python/ops:nn_loss_scaling_utilities_test PASSED in 12.4s
//tensorflow/python/ops:nn_test_cpu PASSED in 79.0s
//tensorflow/python/ops:nn_xent_test_cpu PASSED in 15.0s
//tensorflow/python/ops:op_selector_test PASSED in 8.8s
//tensorflow/python/ops:quantized_conv_ops_test PASSED in 10.5s
//tensorflow/python/ops:quantized_ops_test PASSED in 9.2s
//tensorflow/python/ops:raw_ops_test_cpu PASSED in 11.2s
//tensorflow/python/ops:rnn_grad_test_cpu PASSED in 11.1s
//tensorflow/python/ops:script_ops_test PASSED in 8.6s
//tensorflow/python/ops:sort_ops_test PASSED in 11.2s
//tensorflow/python/ops:sparse_bincount_ops_test_cpu PASSED in 34.4s
//tensorflow/python/ops:sparse_ops_test PASSED in 19.6s
//tensorflow/python/ops:tensor_array_ops_test PASSED in 8.8s
//tensorflow/python/ops:variable_spec_test PASSED in 10.9s
//tensorflow/python/ops:weak_tensor_array_ops_test PASSED in 8.7s
//tensorflow/python/ops:weak_tensor_constant_op_test PASSED in 16.4s
//tensorflow/python/ops:weak_tensor_image_ops_test PASSED in 8.2s
//tensorflow/python/ops:weak_tensor_math_ops_test PASSED in 20.8s
//tensorflow/python/ops:weak_tensor_nn_test_cpu PASSED in 23.1s
//tensorflow/python/ops:weak_tensor_np_array_ops_test PASSED in 33.9s
//tensorflow/python/ops:weak_tensor_np_math_ops_test PASSED in 9.6s
//tensorflow/python/ops:weak_tensor_ops_test PASSED in 77.2s
//tensorflow/python/ops/losses:util_test PASSED in 8.6s
//tensorflow/python/ops/memory_tests:custom_gradient_memory_test_cpu PASSED in 20.3s
//tensorflow/python/ops/numpy_ops:np_array_ops_test_cpu PASSED in 90.0s
//tensorflow/python/ops/numpy_ops:np_arrays_test_cpu PASSED in 15.0s
//tensorflow/python/ops/numpy_ops:np_dtypes_test_cpu PASSED in 15.7s
//tensorflow/python/ops/numpy_ops:np_interop_test_cpu PASSED in 64.1s
//tensorflow/python/ops/numpy_ops:np_logic_test_cpu PASSED in 18.8s
//tensorflow/python/ops/numpy_ops:np_math_ops_test_cpu PASSED in 37.6s
//tensorflow/python/ops/numpy_ops:np_random_test_cpu PASSED in 72.0s
//tensorflow/python/ops/numpy_ops:np_utils_test_cpu PASSED in 13.4s
//tensorflow/python/ops/numpy_ops/integration_test:np_config_test_cpu PASSED in 25.7s
//tensorflow/python/ops/numpy_ops/integration_test:public_symbol_test PASSED in 22.9s
//tensorflow/python/ops/parallel_for:array_test_cpu PASSED in 49.6s
//tensorflow/python/ops/parallel_for:gradients_test_cpu PASSED in 19.7s
//tensorflow/python/ops/parallel_for:pfor_test PASSED in 8.8s
//tensorflow/python/ops/parallel_for:xla_control_flow_ops_test_cpu PASSED in 55.0s
//tensorflow/python/ops/ragged:convert_to_tensor_or_ragged_tensor_op_test PASSED in 8.8s
//tensorflow/python/ops/ragged:ragged_batch_gather_op_test PASSED in 42.6s
//tensorflow/python/ops/ragged:ragged_bincount_ops_test_cpu PASSED in 13.4s
//tensorflow/python/ops/ragged:ragged_bitcast_op_test PASSED in 8.6s
//tensorflow/python/ops/ragged:ragged_boolean_mask_op_test PASSED in 16.2s
//tensorflow/python/ops/ragged:ragged_concat_op_test PASSED in 11.8s
//tensorflow/python/ops/ragged:ragged_const_op_test PASSED in 9.5s
//tensorflow/python/ops/ragged:ragged_constant_value_op_test PASSED in 8.7s
//tensorflow/python/ops/ragged:ragged_cross_op_test PASSED in 20.4s
//tensorflow/python/ops/ragged:ragged_dispatch_test PASSED in 117.2s
//tensorflow/python/ops/ragged:ragged_dynamic_partition_op_test_cpu PASSED in 20.2s
//tensorflow/python/ops/ragged:ragged_eager_test PASSED in 8.1s
//tensorflow/python/ops/ragged:ragged_expand_dims_op_test PASSED in 8.8s
//tensorflow/python/ops/ragged:ragged_factory_ops_test_cpu PASSED in 21.0s
//tensorflow/python/ops/ragged:ragged_fill_empty_rows_op_test PASSED in 10.0s
//tensorflow/python/ops/ragged:ragged_from_sparse_op_test PASSED in 9.8s
//tensorflow/python/ops/ragged:ragged_from_tensor_op_test PASSED in 20.4s
//tensorflow/python/ops/ragged:ragged_gather_nd_op_test PASSED in 11.5s
//tensorflow/python/ops/ragged:ragged_map_flat_values_op_test PASSED in 11.5s
//tensorflow/python/ops/ragged:ragged_map_fn_op_test PASSED in 17.2s
//tensorflow/python/ops/ragged:ragged_math_ops_test PASSED in 16.6s
//tensorflow/python/ops/ragged:ragged_matmul_op_test PASSED in 42.5s
//tensorflow/python/ops/ragged:ragged_merge_dims_op_test PASSED in 30.5s
//tensorflow/python/ops/ragged:ragged_one_hot_op_test PASSED in 12.3s
//tensorflow/python/ops/ragged:ragged_operators_test PASSED in 27.8s
//tensorflow/python/ops/ragged:ragged_placeholder_op_test PASSED in 10.0s
//tensorflow/python/ops/ragged:ragged_print_op_test PASSED in 19.0s
//tensorflow/python/ops/ragged:ragged_range_op_test PASSED in 10.4s
//tensorflow/python/ops/ragged:ragged_rank_op_test PASSED in 9.2s
//tensorflow/python/ops/ragged:ragged_reduce_op_test PASSED in 41.2s
//tensorflow/python/ops/ragged:ragged_resize_image_op_test PASSED in 22.6s
//tensorflow/python/ops/ragged:ragged_reverse_op_test PASSED in 11.5s
//tensorflow/python/ops/ragged:ragged_row_lengths_op_test PASSED in 10.0s
//tensorflow/python/ops/ragged:ragged_row_splits_to_segment_ids_op_test PASSED in 11.5s
//tensorflow/python/ops/ragged:ragged_segment_ids_to_row_splits_op_test PASSED in 10.0s
//tensorflow/python/ops/ragged:ragged_segment_op_test PASSED in 18.0s
//tensorflow/python/ops/ragged:ragged_size_op_test PASSED in 11.0s
//tensorflow/python/ops/ragged:ragged_split_op_test PASSED in 44.7s
//tensorflow/python/ops/ragged:ragged_squeeze_op_test PASSED in 22.5s
//tensorflow/python/ops/ragged:ragged_stack_op_test PASSED in 16.2s
//tensorflow/python/ops/ragged:ragged_tensor_bounding_shape_op_test PASSED in 12.7s
//tensorflow/python/ops/ragged:ragged_tensor_shape_test PASSED in 71.6s
//tensorflow/python/ops/ragged:ragged_tile_op_test PASSED in 42.2s
//tensorflow/python/ops/ragged:ragged_to_sparse_op_test PASSED in 8.8s
//tensorflow/python/ops/ragged:ragged_to_tensor_op_test PASSED in 61.9s
//tensorflow/python/ops/ragged:ragged_util_test PASSED in 22.0s
//tensorflow/python/ops/ragged:ragged_where_op_test PASSED in 27.7s
//tensorflow/python/ops/ragged:row_partition_test PASSED in 24.2s
//tensorflow/python/ops/ragged:string_ngrams_op_test PASSED in 10.3s
//tensorflow/python/ops/ragged:strings_reduce_join_op_test PASSED in 9.0s
//tensorflow/python/ops/structured:structured_array_ops_test PASSED in 42.9s
//tensorflow/python/ops/structured:structured_tensor_slice_test PASSED in 51.3s
//tensorflow/python/ops/structured:structured_tensor_spec_test PASSED in 11.3s
//tensorflow/python/ops/structured:structured_tensor_test PASSED in 41.5s
//tensorflow/python/ops/v1_compat_tests:gradient_checker_test_cpu PASSED in 17.0s
//tensorflow/python/platform:benchmark_test PASSED in 12.5s
//tensorflow/python/platform:build_info_test PASSED in 10.7s
//tensorflow/python/platform:resource_loader_test PASSED in 4.2s
//tensorflow/python/profiler:pprof_profiler_test PASSED in 9.2s
//tensorflow/python/profiler:profile_context_test_cpu PASSED in 33.8s
//tensorflow/python/profiler:profiler_client_test_cpu PASSED in 12.4s
//tensorflow/python/profiler:profiler_test_cpu PASSED in 24.5s
//tensorflow/python/profiler:profiler_v2_test_cpu PASSED in 11.0s
//tensorflow/python/profiler:profiler_wrapper_test PASSED in 8.8s
//tensorflow/python/profiler:tfprof_logger_test PASSED in 9.6s
//tensorflow/python/profiler/internal:flops_registry_test PASSED in 8.6s
//tensorflow/python/profiler/internal:print_model_analysis_test PASSED in 8.9s
//tensorflow/python/profiler/internal:run_metadata_test_cpu PASSED in 21.5s
//tensorflow/python/saved_model:fingerprinting_test PASSED in 14.6s
//tensorflow/python/saved_model:load_v1_in_v2_test PASSED in 24.9s
//tensorflow/python/saved_model:loader_test PASSED in 16.9s
//tensorflow/python/saved_model:method_name_updater_test PASSED in 12.4s
//tensorflow/python/saved_model:metrics_test PASSED in 16.2s
//tensorflow/python/saved_model:nested_structure_coder_test PASSED in 13.4s
//tensorflow/python/saved_model:pywrap_saved_model_fingerprinting_test PASSED in 10.1s
//tensorflow/python/saved_model:pywrap_saved_model_metrics_test PASSED in 13.0s
//tensorflow/python/saved_model:revived_types_test PASSED in 11.6s
//tensorflow/python/saved_model:save_context_test PASSED in 9.4s
//tensorflow/python/saved_model:save_test PASSED in 44.6s
//tensorflow/python/saved_model:saved_model_test PASSED in 32.6s
//tensorflow/python/saved_model:signature_def_utils_test PASSED in 12.4s
//tensorflow/python/saved_model:simple_save_test PASSED in 12.0s
//tensorflow/python/saved_model:tracing_utils_test PASSED in 10.9s
//tensorflow/python/saved_model:utils_test PASSED in 10.3s
//tensorflow/python/saved_model/model_utils:export_output_test PASSED in 10.6s
//tensorflow/python/saved_model/model_utils:export_test PASSED in 13.5s
//tensorflow/python/saved_model/model_utils:mode_keys_test PASSED in 9.2s
//tensorflow/python/saved_model/registration:registration_saving_test PASSED in 20.7s
//tensorflow/python/saved_model/registration:registration_test PASSED in 12.6s
//tensorflow/python/saved_model/registration:tf_registration_test PASSED in 26.1s
//tensorflow/python/saved_model/tests:variable_wrapper_test PASSED in 12.5s
//tensorflow/python/summary:plugin_asset_test PASSED in 12.0s
//tensorflow/python/summary:summary_iterator_test PASSED in 11.8s
//tensorflow/python/summary:summary_test PASSED in 15.3s
//tensorflow/python/summary:summary_v2_test PASSED in 16.7s
//tensorflow/python/summary/writer:writer_test PASSED in 26.2s
//tensorflow/python/tools:aot_compiled_test PASSED in 23.1s
//tensorflow/python/tools:freeze_graph_test PASSED in 10.5s
//tensorflow/python/tools:optimize_for_inference_test PASSED in 9.4s
//tensorflow/python/tools:print_selective_registration_header_test PASSED in 10.2s
//tensorflow/python/tools:saved_model_cli_test PASSED in 18.5s
//tensorflow/python/tools:saved_model_utils_test PASSED in 9.3s
//tensorflow/python/tools:strip_unused_test PASSED in 8.5s
//tensorflow/python/tools/api/generator:create_python_api_test PASSED in 9.5s
//tensorflow/python/tools/api/generator:output_init_files_test PASSED in 19.6s
//tensorflow/python/tools/api/generator:tensorflow_doc_srcs_test PASSED in 9.6s
//tensorflow/python/tools/api/generator2/extractor:extractor_test PASSED in 2.7s
//tensorflow/python/tools/api/generator2/generator:generator_test PASSED in 2.4s
//tensorflow/python/tools/api/generator2/shared:exported_api_test PASSED in 8.5s
//tensorflow/python/tpu:bfloat16_test PASSED in 9.8s
//tensorflow/python/tpu:feature_column_test PASSED in 18.2s
//tensorflow/python/tpu:topology_test PASSED in 10.0s
//tensorflow/python/tpu:tpu_embedding_for_serving_test PASSED in 14.3s
//tensorflow/python/tpu:tpu_embedding_v2_utils_test PASSED in 12.8s
//tensorflow/python/tpu:tpu_embedding_v3_checkpoint_adapter_test PASSED in 9.7s
//tensorflow/python/tpu:tpu_embedding_v3_utils_test PASSED in 12.0s
//tensorflow/python/tpu:tpu_infeed_test PASSED in 11.3s
//tensorflow/python/tpu:tpu_sharding_test PASSED in 10.5s
//tensorflow/python/tpu:tpu_test_wrapper_test PASSED in 8.4s
//tensorflow/python/tpu/client:client_py_test PASSED in 11.6s
//tensorflow/python/trackable:autotrackable_test PASSED in 13.0s
//tensorflow/python/trackable:base_delegate_test PASSED in 17.2s
//tensorflow/python/trackable:base_test PASSED in 11.9s
//tensorflow/python/trackable:python_state_test PASSED in 11.6s
//tensorflow/python/trackable:resource_test PASSED in 11.0s
//tensorflow/python/trackable:trackable_utils_test PASSED in 9.0s
//tensorflow/python/training:adadelta_test_cpu PASSED in 23.0s
//tensorflow/python/training:adagrad_da_test_cpu PASSED in 16.1s
//tensorflow/python/training:adagrad_test_cpu PASSED in 20.1s
//tensorflow/python/training:adam_test_cpu PASSED in 22.8s
//tensorflow/python/training:basic_loops_test_cpu PASSED in 10.9s
//tensorflow/python/training:basic_session_run_hooks_test PASSED in 44.1s
//tensorflow/python/training:checkpoint_ops_test PASSED in 10.7s
//tensorflow/python/training:coordinator_test_cpu PASSED in 18.2s
//tensorflow/python/training:device_setter_test_cpu PASSED in 12.0s
//tensorflow/python/training:ftrl_test_cpu PASSED in 24.6s
//tensorflow/python/training:gradient_descent_test_cpu PASSED in 16.4s
//tensorflow/python/training:input_test PASSED in 34.2s
//tensorflow/python/training:momentum_test_cpu PASSED in 19.4s
//tensorflow/python/training:monitored_session_test PASSED in 39.2s
//tensorflow/python/training:moving_averages_test_cpu PASSED in 27.1s
//tensorflow/python/training:optimizer_test_cpu PASSED in 15.9s
//tensorflow/python/training:proximal_adagrad_test_cpu PASSED in 16.1s
//tensorflow/python/training:proximal_gradient_descent_test_cpu PASSED in 19.9s
//tensorflow/python/training:quantize_training_test_cpu PASSED in 11.3s
//tensorflow/python/training:queue_runner_test_cpu PASSED in 11.9s
//tensorflow/python/training:rmsprop_test_cpu PASSED in 42.3s
//tensorflow/python/training:saver_large_partitioned_variable_test PASSED in 16.8s
//tensorflow/python/training:saver_test_2gpu PASSED in 57.0s
//tensorflow/python/training:saver_test_cpu PASSED in 53.7s
//tensorflow/python/training:server_lib_multiple_containers_test PASSED in 11.8s
//tensorflow/python/training:server_lib_same_variables_clear_container_test PASSED in 13.1s
//tensorflow/python/training:server_lib_same_variables_clear_test PASSED in 11.1s
//tensorflow/python/training:server_lib_same_variables_no_clear_test PASSED in 12.2s
//tensorflow/python/training:server_lib_sparse_job_test PASSED in 11.1s
//tensorflow/python/training:server_lib_test PASSED in 24.9s
//tensorflow/python/training:session_manager_test_cpu PASSED in 82.7s
//tensorflow/python/training:slot_creator_test_cpu PASSED in 14.9s
//tensorflow/python/training:supervisor_test PASSED in 24.8s
//tensorflow/python/training:training_ops_mlir_test_cpu PASSED in 12.7s
//tensorflow/python/training:training_ops_test_cpu PASSED in 12.3s
//tensorflow/python/training:training_util_test PASSED in 15.5s
//tensorflow/python/training:warm_starting_util_test PASSED in 40.4s
//tensorflow/python/training/experimental:loss_scale_optimizer_test PASSED in 17.1s
//tensorflow/python/training/experimental:loss_scale_test PASSED in 24.7s
//tensorflow/python/training/experimental:mixed_precision_test_cpu PASSED in 13.5s
//tensorflow/python/training/saving:saveable_object_util_test PASSED in 11.5s
//tensorflow/python/util:compat_test PASSED in 10.1s
//tensorflow/python/util:decorator_utils_test PASSED in 10.9s
//tensorflow/python/util:deprecation_test PASSED in 10.7s
//tensorflow/python/util:dispatch_test PASSED in 23.8s
//tensorflow/python/util:example_parser_configuration_test PASSED in 23.0s
//tensorflow/python/util:fast_module_type_test PASSED in 17.4s
//tensorflow/python/util:function_parameter_canonicalizer_test PASSED in 10.0s
//tensorflow/python/util:function_utils_test PASSED in 10.8s
//tensorflow/python/util:keyword_args_test PASSED in 10.4s
//tensorflow/python/util:lazy_loader_test PASSED in 10.9s
//tensorflow/python/util:lock_util_test PASSED in 11.1s
//tensorflow/python/util:module_wrapper_test PASSED in 11.2s
//tensorflow/python/util:nest_test PASSED in 27.8s
//tensorflow/python/util:object_identity_test PASSED in 10.8s
//tensorflow/python/util:pywrap_xla_ops_test PASSED in 3.2s
//tensorflow/python/util:serialization_test PASSED in 10.5s
//tensorflow/python/util:tf_contextlib_test PASSED in 9.8s
//tensorflow/python/util:tf_decorator_test PASSED in 10.9s
//tensorflow/python/util:tf_export_test PASSED in 10.3s
//tensorflow/python/util:tf_inspect_test PASSED in 11.1s
//tensorflow/python/util:tf_should_use_test PASSED in 11.7s
//tensorflow/python/util:tf_stack_test PASSED in 10.4s
//tensorflow/python/util:traceback_utils_test PASSED in 11.1s
//tensorflow/python/util:type_annotations_test PASSED in 10.7s
//tensorflow/python/util:variable_utils_test PASSED in 13.6s
//tensorflow/python/util:vlog_test PASSED in 11.1s
//tensorflow/python/util/protobuf:protobuf_compare_test PASSED in 5.0s
//tensorflow/tools/api/tests:module_test PASSED in 22.9s
//tensorflow/tools/benchmark:benchmark_model_test PASSED in 1.9s
//tensorflow/tools/common:public_api_test PASSED in 2.5s
//tensorflow/tools/common:traverse_test PASSED in 2.6s
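Each single-run summary line above follows the fixed shape "//package:target PASSED in <seconds>s". A minimal sketch of a parser for those PASSED lines (illustrative only; parse_passed and RESULT_RE are hypothetical names, not part of Bazel or TensorFlow):

import re

# Matches only plain single-run PASSED lines, not FLAKY or multi-run entries.
RESULT_RE = re.compile(r"^(?P<target>//\S+) PASSED in (?P<seconds>\d+(?:\.\d+)?)s$")

def parse_passed(line):
    # Returns (target, seconds) for a PASSED summary line, else None.
    m = RESULT_RE.match(line.strip())
    return (m.group("target"), float(m.group("seconds"))) if m else None

print(parse_passed("//tensorflow/python/util:vlog_test PASSED in 11.1s"))
# ('//tensorflow/python/util:vlog_test', 11.1)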
//tensorflow/tools/compatibility:all_renames_v2_test PASSED in 8.3s
//tensorflow/tools/compatibility:ast_edits_test PASSED in 8.8s
//tensorflow/tools/compatibility:test_file_v1_0 PASSED in 22.9s
//tensorflow/tools/compatibility:test_file_v2_0 PASSED in 118.2s
//tensorflow/tools/compatibility:tf_upgrade_test PASSED in 8.1s
//tensorflow/tools/compatibility:tf_upgrade_v2_safety_test PASSED in 8.2s
//tensorflow/tools/docs:tf_doctest_test PASSED in 5.1s
//tensorflow/tools/graph_transforms:file_utils_test PASSED in 0.4s
//tensorflow/tools/graph_transforms:transform_graph_test PASSED in 1.8s
//tensorflow/tools/graph_transforms:transform_utils_test PASSED in 1.8s
//tensorflow/tools/graph_transforms:transforms_test PASSED in 2.8s
//tensorflow/tools/proto_splitter:merge_test PASSED in 0.6s
//tensorflow/tools/proto_splitter:split_graph_def_test PASSED in 8.1s
//tensorflow/tools/proto_splitter:split_test PASSED in 8.1s
//tensorflow/tools/proto_splitter:util_test PASSED in 8.4s
//tensorflow/tools/proto_splitter/cc:composable_splitter_test PASSED in 0.2s
//tensorflow/tools/proto_splitter/cc:graph_def_splitter_test PASSED in 0.4s
//tensorflow/tools/proto_splitter/cc:saved_model_splitter_test PASSED in 0.4s
//tensorflow/tools/proto_splitter/cc:util_test PASSED in 2.2s
//tensorflow/tools/proto_splitter/python:saved_model_test PASSED in 8.7s
//tensorflow/tools/proto_splitter/python:test_util_test PASSED in 8.5s
//tensorflow/tools/proto_text:gen_proto_text_functions_lib_test PASSED in 0.1s
//tensorflow/tools/tensorflow_builder/compat_checker:compat_checker_test PASSED in 0.4s
//tensorflow/compiler/tests:complex_div_test_cpu PASSED in 78.6s Stats over 2 runs: max = 78.6s, min = 51.2s, avg = 64.9s, dev = 13.7s
//tensorflow/compiler/tests:complex_div_test_cpu_mlir_bridge_test PASSED in 76.6s Stats over 2 runs: max = 76.6s, min = 75.3s, avg = 75.9s, dev = 0.6s
//tensorflow/python/data/experimental/kernel_tests/optimization:optimization_test PASSED in 28.9s Stats over 2 runs: max = 28.9s, min = 11.7s, avg = 20.3s, dev = 8.6s
//tensorflow/python/data/experimental/kernel_tests/service:metadata_test PASSED in 20.9s Stats over 2 runs: max = 20.9s, min = 10.1s, avg = 15.5s, dev = 5.4s
//tensorflow/python/data/kernel_tests:padded_batch_test PASSED in 38.0s Stats over 2 runs: max = 38.0s, min = 30.8s, avg = 34.4s, dev = 3.6s
//tensorflow/python/data/kernel_tests:repeat_test PASSED in 167.6s Stats over 2 runs: max = 167.6s, min = 147.3s, avg = 157.4s, dev = 10.1s
//tensorflow/python/data/kernel_tests:window_test PASSED in 43.1s Stats over 2 runs: max = 43.1s, min = 34.3s, avg = 38.7s, dev = 4.4s
//tensorflow/python/kernel_tests/array_ops:scatter_nd_ops_test_cpu PASSED in 16.0s Stats over 2 runs: max = 16.0s, min = 15.2s, avg = 15.6s, dev = 0.4s
//tensorflow/python/kernel_tests/control_flow:functional_ops_test_cpu PASSED in 19.4s Stats over 2 runs: max = 19.4s, min = 14.8s, avg = 17.1s, dev = 2.3s
//tensorflow/python/kernel_tests/control_flow:map_fn_test_cpu PASSED in 11.6s Stats over 2 runs: max = 11.6s, min = 8.0s, avg = 9.8s, dev = 1.8s
//tensorflow/python/kernel_tests/nn_ops:atrous_conv2d_test_cpu PASSED in 42.2s Stats over 2 runs: max = 42.2s, min = 27.2s, avg = 34.7s, dev = 7.5s
//tensorflow/python/kernel_tests/nn_ops:bias_op_d9m_test_cpu PASSED in 155.0s Stats over 2 runs: max = 155.0s, min = 55.9s, avg = 105.4s, dev = 49.6s
//tensorflow/python/kernel_tests/nn_ops:conv2d_backprop_filter_grad_test_cpu PASSED in 10.8s Stats over 2 runs: max = 10.8s, min = 4.9s, avg = 7.9s, dev = 2.9s
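For sharded or repeated targets, the summary appends a "Stats over N runs" clause. The dev field appears consistent with the population standard deviation of the per-run times; checking against the complex_div_test_cpu entry above: (78.6 + 51.2) / 2 = 64.9, and sqrt(((78.6 - 64.9)^2 + (51.2 - 64.9)^2) / 2) = 13.7. A minimal sketch reproducing the four fields (illustrative; run_stats is a hypothetical helper, and the dev formula is an assumption inferred from the entries above):

from math import sqrt

def run_stats(times):
    # Reproduces the max/min/avg/dev fields printed for multi-run targets;
    # dev is computed as the population standard deviation (assumption).
    avg = sum(times) / len(times)
    dev = sqrt(sum((t - avg) ** 2 for t in times) / len(times))
    return {"max": max(times), "min": min(times), "avg": avg, "dev": dev}

print(run_stats([78.6, 51.2]))
# {'max': 78.6, 'min': 51.2, 'avg': 64.9, 'dev': 13.7}  (up to float rounding)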
//tensorflow/python/kernel_tests/signal:fft_ops_test_cpu PASSED in 132.4s Stats over 2 runs: max = 132.4s, min = 130.6s, avg = 131.5s, dev = 0.9s
//tensorflow/python/ops:control_flow_ops_test_cpu PASSED in 38.6s Stats over 2 runs: max = 38.6s, min = 36.0s, avg = 37.3s, dev = 1.3s
//tensorflow/compiler/tests:spacetobatch_op_test_cpu PASSED in 12.1s Stats over 3 runs: max = 12.1s, min = 5.9s, avg = 8.1s, dev = 2.9s
//tensorflow/compiler/tests:spacetobatch_op_test_cpu_mlir_bridge_test PASSED in 13.4s Stats over 3 runs: max = 13.4s, min = 11.3s, avg = 12.2s, dev = 0.9s
//tensorflow/core/data/service:thread_safe_buffer_test PASSED in 0.8s Stats over 3 runs: max = 0.8s, min = 0.1s, avg = 0.4s, dev = 0.3s
//tensorflow/python/data/experimental/kernel_tests/service:multi_process_cluster_test PASSED in 25.6s Stats over 3 runs: max = 25.6s, min = 7.5s, avg = 15.3s, dev = 7.6s
//tensorflow/python/data/kernel_tests:unique_test PASSED in 22.4s Stats over 3 runs: max = 22.4s, min = 9.0s, avg = 15.9s, dev = 5.5s
//tensorflow/python/distribute/coordinator:metric_utils_test PASSED in 32.9s Stats over 3 runs: max = 32.9s, min = 12.1s, avg = 21.3s, dev = 8.7s
//tensorflow/python/kernel_tests/array_ops:gather_op_test_cpu PASSED in 62.4s Stats over 3 runs: max = 62.4s, min = 24.7s, avg = 38.1s, dev = 17.2s
//tensorflow/python/kernel_tests/array_ops:weights_broadcast_test PASSED in 13.8s Stats over 3 runs: max = 13.8s, min = 7.1s, avg = 9.6s, dev = 3.0s
//tensorflow/python/kernel_tests/distributions:util_test_cpu PASSED in 18.9s Stats over 3 runs: max = 18.9s, min = 10.2s, avg = 13.1s, dev = 4.1s
//tensorflow/python/kernel_tests/linalg/sparse:csr_sparse_matrix_grad_test_cpu PASSED in 10.1s Stats over 3 runs: max = 10.1s, min = 4.7s, avg = 7.2s, dev = 2.2s
//tensorflow/python/kernel_tests/random:multinomial_op_big_test_cpu PASSED in 22.9s Stats over 3 runs: max = 22.9s, min = 8.1s, avg = 13.3s, dev = 6.8s
//tensorflow/python/eager:small_constants_optimizer_test_cpu FLAKY, failed in 2 out of 3 in 276.1s Stats over 3 runs: max = 276.1s, min = 159.0s, avg = 218.3s, dev = 47.8s
/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/testlogs/tensorflow/python/eager/small_constants_optimizer_test_cpu/test_attempts/attempt_1.log
/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/testlogs/tensorflow/python/eager/small_constants_optimizer_test_cpu/test_attempts/attempt_2.log
//tensorflow/core/kernels:example_parsing_ops_test PASSED in 0.5s Stats over 4 runs: max = 0.5s, min = 0.4s, avg = 0.4s, dev = 0.0s
//tensorflow/dtensor/python/tests:batchparallel_spmd_test_cpu PASSED in 18.7s Stats over 4 runs: max = 18.7s, min = 13.1s, avg = 15.0s, dev = 2.2s
//tensorflow/dtensor/python/tests:conv_test_cpu PASSED in 14.2s Stats over 4 runs: max = 14.2s, min = 8.5s, avg = 10.5s, dev = 2.2s
//tensorflow/dtensor/python/tests:sparse_test_cpu PASSED in 14.5s Stats over 4 runs: max = 14.5s, min = 8.1s, avg = 10.4s, dev = 2.5s
//tensorflow/python/data/experimental/kernel_tests:auto_shard_dataset_test PASSED in 46.9s Stats over 4 runs: max = 46.9s, min = 19.5s, avg = 35.4s, dev = 10.3s
//tensorflow/python/data/experimental/kernel_tests:from_list_test PASSED in 62.1s Stats over 4 runs: max = 62.1s, min = 44.7s, avg = 50.7s, dev = 6.8s
//tensorflow/python/data/experimental/kernel_tests:map_and_batch_test PASSED in 65.2s Stats over 4 runs: max = 65.2s, min = 33.1s, avg = 46.8s, dev = 11.8s
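FLAKY entries such as small_constants_optimizer_test_cpu above record how many of the attempted runs failed, followed by the path of each attempt's log. A minimal sketch that extracts those entries and their observed failure rate (illustrative; FLAKY_RE and flaky_rates are hypothetical names, not a Bazel API):

import re

# Matches entries of the form: //pkg:target FLAKY, failed in 2 out of 3 ...
FLAKY_RE = re.compile(r"(?P<target>//\S+) FLAKY, failed in (?P<failed>\d+) out of (?P<runs>\d+)")

def flaky_rates(summary_text):
    # Yields (target, failure_rate) for each FLAKY entry in the summary.
    for m in FLAKY_RE.finditer(summary_text):
        yield m.group("target"), int(m.group("failed")) / int(m.group("runs"))

entry = "//tensorflow/python/eager:small_constants_optimizer_test_cpu FLAKY, failed in 2 out of 3 in 276.1s"
for target, rate in flaky_rates(entry):
    print(f"{target}: {rate:.0%} of runs failed")  # 67% of runs failed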
//tensorflow/python/data/experimental/kernel_tests:parse_example_dataset_test PASSED in 35.5s Stats over 4 runs: max = 35.5s, min = 17.3s, avg = 26.7s, dev = 7.9s
//tensorflow/python/data/experimental/kernel_tests:rebatch_dataset_test PASSED in 24.0s Stats over 4 runs: max = 24.0s, min = 9.7s, avg = 15.7s, dev = 5.3s
//tensorflow/python/data/experimental/kernel_tests:sql_dataset_test PASSED in 77.2s Stats over 4 runs: max = 77.2s, min = 34.3s, avg = 51.8s, dev = 15.6s
//tensorflow/python/data/experimental/kernel_tests/service:cross_trainer_cache_ft_test PASSED in 12.9s Stats over 4 runs: max = 12.9s, min = 5.2s, avg = 8.5s, dev = 2.8s
//tensorflow/python/data/kernel_tests:fixed_length_record_dataset_test PASSED in 16.9s Stats over 4 runs: max = 16.9s, min = 6.4s, avg = 12.0s, dev = 3.7s
//tensorflow/python/data/kernel_tests:from_generator_test PASSED in 33.5s Stats over 4 runs: max = 33.5s, min = 11.3s, avg = 19.9s, dev = 8.4s
//tensorflow/python/data/kernel_tests:from_tensor_slices_test PASSED in 38.7s Stats over 4 runs: max = 38.7s, min = 24.6s, avg = 28.6s, dev = 5.8s
//tensorflow/python/data/kernel_tests:from_tensors_test PASSED in 53.3s Stats over 4 runs: max = 53.3s, min = 37.0s, avg = 44.8s, dev = 7.8s
//tensorflow/python/data/kernel_tests:group_by_window_test PASSED in 24.6s Stats over 4 runs: max = 24.6s, min = 14.4s, avg = 18.5s, dev = 4.0s
//tensorflow/python/data/kernel_tests:list_files_test PASSED in 62.1s Stats over 4 runs: max = 62.1s, min = 41.3s, avg = 51.7s, dev = 9.2s
//tensorflow/python/data/kernel_tests:ragged_batch_test PASSED in 29.7s Stats over 4 runs: max = 29.7s, min = 22.8s, avg = 24.9s, dev = 2.8s
//tensorflow/python/data/kernel_tests:take_test PASSED in 96.0s Stats over 4 runs: max = 96.0s, min = 76.4s, avg = 84.4s, dev = 7.8s
//tensorflow/python/data/kernel_tests:take_while_test PASSED in 51.6s Stats over 4 runs: max = 51.6s, min = 37.7s, avg = 43.5s, dev = 5.4s
//tensorflow/python/data/kernel_tests:text_line_dataset_test PASSED in 58.6s Stats over 4 runs: max = 58.6s, min = 28.9s, avg = 40.1s, dev = 11.8s
//tensorflow/python/data/kernel_tests:zip_test PASSED in 22.2s Stats over 4 runs: max = 22.2s, min = 9.8s, avg = 13.1s, dev = 5.3s
//tensorflow/python/debug/lib:dumping_callback_test_cpu PASSED in 21.6s Stats over 4 runs: max = 21.6s, min = 13.2s, avg = 16.9s, dev = 3.2s
//tensorflow/python/distribute:cross_device_ops_test_cpu PASSED in 34.9s Stats over 4 runs: max = 34.9s, min = 20.1s, avg = 26.5s, dev = 6.0s
//tensorflow/python/framework:convert_to_constants_test PASSED in 27.0s Stats over 4 runs: max = 27.0s, min = 19.2s, avg = 23.8s, dev = 3.0s
//tensorflow/python/kernel_tests:collective_ops_test_cpu PASSED in 46.5s Stats over 4 runs: max = 46.5s, min = 31.7s, avg = 38.3s, dev = 5.3s
//tensorflow/python/kernel_tests/array_ops:concat_op_test_cpu PASSED in 17.0s Stats over 4 runs: max = 17.0s, min = 7.4s, avg = 13.0s, dev = 3.9s
//tensorflow/python/kernel_tests/array_ops:init_ops_test_cpu PASSED in 70.2s Stats over 4 runs: max = 70.2s, min = 12.0s, avg = 37.5s, dev = 23.7s
//tensorflow/python/kernel_tests/array_ops:split_op_test_cpu PASSED in 38.3s Stats over 4 runs: max = 38.3s, min = 12.8s, avg = 25.2s, dev = 12.2s
//tensorflow/python/kernel_tests/linalg:einsum_op_test_cpu PASSED in 141.1s Stats over 4 runs: max = 141.1s, min = 17.4s, avg = 66.7s, dev = 49.0s
//tensorflow/python/kernel_tests/linalg:linear_operator_lower_triangular_test_cpu PASSED in 56.8s Stats over 4 runs: max = 56.8s, min = 44.2s, avg = 50.8s, dev = 5.3s
//tensorflow/python/kernel_tests/nn_ops:conv_ops_test_cpu PASSED in 45.8s Stats over 4 runs: max = 45.8s, min = 26.7s, avg = 31.6s, dev = 8.2s
//tensorflow/python/kernel_tests/random:random_gamma_test_cpu PASSED in 133.9s Stats over 4 runs: max = 133.9s, min = 7.5s, avg = 62.8s, dev = 56.2s
//tensorflow/python/kernel_tests/signal:window_ops_test_cpu PASSED in 31.6s Stats over 4 runs: max = 31.6s, min = 16.7s, avg = 24.5s, dev = 5.4s
//tensorflow/python/ops:nn_batchnorm_test_cpu PASSED in 28.6s Stats over 4 runs: max = 28.6s, min = 15.5s, avg = 19.6s, dev = 5.3s
//tensorflow/python/ops:nn_fused_batchnorm_d9m_test_cpu PASSED in 27.3s Stats over 4 runs: max = 27.3s, min = 17.1s, avg = 21.3s, dev = 3.9s
//tensorflow/python/ops/ragged:ragged_gather_op_test PASSED in 70.7s Stats over 4 runs: max = 70.7s, min = 19.9s, avg = 41.4s, dev = 18.6s
//tensorflow/python/ops/ragged:ragged_getitem_test PASSED in 46.4s Stats over 4 runs: max = 46.4s, min = 38.1s, avg = 43.4s, dev = 3.2s
//tensorflow/python/kernel_tests/linalg:matrix_triangular_solve_op_test_cpu FLAKY, failed in 1 out of 4 in 900.3s Stats over 4 runs: max = 900.3s, min = 7.9s, avg = 247.3s, dev = 377.6s
/home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/testlogs/tensorflow/python/kernel_tests/linalg/matrix_triangular_solve_op_test_cpu/shard_1_of_3/test_attempts/attempt_1.log
//tensorflow/compiler/tests:conv3d_test_cpu PASSED in 38.5s Stats over 5 runs: max = 38.5s, min = 6.5s, avg = 27.3s, dev = 11.8s
//tensorflow/compiler/tests:conv3d_test_cpu_mlir_bridge_test PASSED in 45.1s Stats over 5 runs: max = 45.1s, min = 6.6s, avg = 16.0s, dev = 14.8s
//tensorflow/compiler/tests:depthwise_conv_op_test_cpu PASSED in 15.9s Stats over 5 runs: max = 15.9s, min = 4.1s, avg = 7.9s, dev = 4.2s
//tensorflow/compiler/tests:depthwise_conv_op_test_cpu_mlir_bridge_test PASSED in 16.8s Stats over 5 runs: max = 16.8s, min = 5.9s, avg = 10.3s, dev = 3.8s
//tensorflow/compiler/tests:fused_batchnorm_test_cpu PASSED in 10.3s Stats over 5 runs: max = 10.3s, min = 4.3s, avg = 5.6s, dev = 2.4s
//tensorflow/compiler/tests:fused_batchnorm_test_cpu_mlir_bridge_test PASSED in 11.2s Stats over 5 runs: max = 11.2s, min = 6.4s, avg = 8.3s, dev = 2.2s
//tensorflow/compiler/tests:reduce_ops_test_cpu PASSED in 12.6s Stats over 5 runs: max = 12.6s, min = 6.0s, avg = 7.8s, dev = 2.4s
//tensorflow/compiler/tests:reduce_ops_test_cpu_mlir_bridge_test PASSED in 13.4s Stats over 5 runs: max = 13.4s, min = 5.9s, avg = 8.0s, dev = 2.8s
//tensorflow/compiler/tests:special_math_test_cpu PASSED in 165.3s Stats over 5 runs: max = 165.3s, min = 22.1s, avg = 71.4s, dev = 51.0s
//tensorflow/compiler/tests:special_math_test_cpu_mlir_bridge_test PASSED in 148.4s Stats over 5 runs: max = 148.4s, min = 23.5s, avg = 69.9s, dev = 44.1s
//tensorflow/core/grappler/optimizers:constant_folding_test PASSED in 2.8s Stats over 5 runs: max = 2.8s, min = 2.2s, avg = 2.5s, dev = 0.3s
//tensorflow/dtensor/python/tests:layout_propagation_test_cpu PASSED in 14.4s Stats over 5 runs: max = 14.4s, min = 6.4s, avg = 8.9s, dev = 2.8s
//tensorflow/dtensor/python/tests:multi_mesh_test_cpu PASSED in 58.1s Stats over 5 runs: max = 58.1s, min = 5.2s, avg = 23.9s, dev = 19.9s
//tensorflow/python/distribute:mirrored_strategy_test_2gpu PASSED in 20.4s Stats over 5 runs: max = 20.4s, min = 10.6s, avg = 12.9s, dev = 3.8s
//tensorflow/python/distribute:mirrored_strategy_test_cpu PASSED in 20.2s Stats over 5 runs: max = 20.2s, min = 8.8s, avg = 12.8s, dev = 4.0s
//tensorflow/python/eager:device_placement_test_cpu PASSED in 11.1s Stats over 5 runs: max = 11.1s, min = 5.4s, avg = 7.5s, dev = 2.0s
//tensorflow/python/eager:forwardprop_test_cpu PASSED in 234.5s Stats over 5 runs: max = 234.5s, min = 15.6s, avg = 81.6s, dev = 78.2s
//tensorflow/python/eager/polymorphic_function:gradients_test_cpu PASSED in 24.5s Stats over 5 runs: max = 24.5s, min = 10.3s, avg = 15.5s, dev = 5.2s
//tensorflow/python/grappler:cluster_test_cpu PASSED in 10.1s Stats over 5 runs: max = 10.1s, min = 4.6s, avg = 6.5s, dev = 1.9s
//tensorflow/python/kernel_tests/linalg:cholesky_op_test_cpu PASSED in 96.1s Stats over 5 runs: max = 96.1s, min = 39.0s, avg = 63.7s, dev = 19.8s
//tensorflow/python/kernel_tests/linalg:linear_operator_adjoint_test_cpu PASSED in 63.1s Stats over 5 runs: max = 63.1s, min = 57.7s, avg = 61.3s, dev = 2.1s
//tensorflow/python/kernel_tests/linalg:linear_operator_composition_test_cpu PASSED in 155.6s Stats over 5 runs: max = 155.6s, min = 126.0s, avg = 140.1s, dev = 9.7s
//tensorflow/python/kernel_tests/linalg:linear_operator_diag_test_cpu PASSED in 64.7s Stats over 5 runs: max = 64.7s, min = 51.2s, avg = 58.4s, dev = 4.6s
//tensorflow/python/kernel_tests/linalg:linear_operator_full_matrix_test_cpu PASSED in 97.7s Stats over 5 runs: max = 97.7s, min = 72.4s, avg = 81.9s, dev = 8.6s
//tensorflow/python/kernel_tests/linalg:linear_operator_householder_test_cpu PASSED in 69.7s Stats over 5 runs: max = 69.7s, min = 52.0s, avg = 62.5s, dev = 5.9s
//tensorflow/python/kernel_tests/linalg:linear_operator_identity_test_cpu PASSED in 75.3s Stats over 5 runs: max = 75.3s, min = 57.6s, avg = 67.0s, dev = 7.2s
//tensorflow/python/kernel_tests/linalg:linear_operator_inversion_test_cpu PASSED in 56.2s Stats over 5 runs: max = 56.2s, min = 36.0s, avg = 46.6s, dev = 8.4s
//tensorflow/python/kernel_tests/linalg:linear_operator_permutation_test_cpu PASSED in 47.6s Stats over 5 runs: max = 47.6s, min = 31.6s, avg = 38.2s, dev = 5.4s
//tensorflow/python/kernel_tests/linalg:linear_operator_toeplitz_test_cpu PASSED in 60.2s Stats over 5 runs: max = 60.2s, min = 43.4s, avg = 49.8s, dev = 6.3s
//tensorflow/python/kernel_tests/linalg:linear_operator_util_test_cpu PASSED in 14.9s Stats over 5 runs: max = 14.9s, min = 4.8s, avg = 7.0s, dev = 3.9s
//tensorflow/python/kernel_tests/linalg:linear_operator_zeros_test_cpu PASSED in 43.7s Stats over 5 runs: max = 43.7s, min = 37.4s, avg = 39.8s, dev = 2.2s
//tensorflow/python/kernel_tests/linalg:tridiagonal_matmul_op_test_cpu PASSED in 166.8s Stats over 5 runs: max = 166.8s, min = 4.1s, avg = 38.8s, dev = 64.1s
//tensorflow/python/kernel_tests/nn_ops:fractional_avg_pool_op_test PASSED in 16.2s Stats over 5 runs: max = 16.2s, min = 6.1s, avg = 9.6s, dev = 3.9s
//tensorflow/python/kernel_tests/nn_ops:fractional_max_pool_op_test PASSED in 15.9s Stats over 5 runs: max = 15.9s, min = 5.1s, avg = 9.1s, dev = 4.0s
//tensorflow/python/kernel_tests/sparse_ops:sparse_ops_test_cpu PASSED in 43.8s Stats over 5 runs: max = 43.8s, min = 5.2s, avg = 16.2s, dev = 14.3s
//tensorflow/python/ops/parallel_for:math_test_cpu PASSED in 80.3s Stats over 5 runs: max = 80.3s, min = 29.9s, avg = 52.8s, dev = 16.6s
//tensorflow/compiler/tests:scan_ops_test_cpu PASSED in 21.2s Stats over 6 runs: max = 21.2s, min = 12.3s, avg = 16.3s, dev = 2.7s
//tensorflow/compiler/tests:scan_ops_test_cpu_mlir_bridge_test PASSED in 13.8s Stats over 6 runs: max = 13.8s, min = 7.0s, avg = 9.2s, dev = 2.2s
//tensorflow/python/data/experimental/kernel_tests:make_batched_features_dataset_test PASSED in 34.4s Stats over 6 runs: max = 34.4s, min = 5.2s, avg = 17.9s, dev = 12.0s
//tensorflow/python/kernel_tests/array_ops:diag_op_test_cpu PASSED in 61.6s Stats over 6 runs: max = 61.6s, min = 5.3s, avg = 18.0s, dev = 19.6s
//tensorflow/python/kernel_tests/math_ops:reduction_ops_test_cpu PASSED in 64.2s Stats over 6 runs: max = 64.2s, min = 26.8s, avg = 44.7s, dev = 12.5s
//tensorflow/python/distribute/experimental/rpc:rpc_ops_test PASSED in 14.4s Stats over 7 runs: max = 14.4s, min = 7.3s, avg = 9.7s, dev = 2.5s
//tensorflow/compiler/tests:ftrl_test_cpu PASSED in 10.6s Stats over 8 runs: max = 10.6s, min = 4.2s, avg = 5.2s, dev = 2.0s
//tensorflow/compiler/tests:matrix_diag_ops_test_cpu PASSED in 48.1s Stats over 8 runs: max = 48.1s, min = 3.5s, avg = 19.1s, dev = 16.0s
//tensorflow/compiler/tests:matrix_diag_ops_test_cpu_mlir_bridge_test PASSED in 90.9s Stats over 8 runs: max = 90.9s, min = 4.0s, avg = 32.5s, dev = 29.5s
//tensorflow/compiler/tests:ternary_ops_test_cpu PASSED in 21.6s Stats over 8 runs: max = 21.6s, min = 5.6s, avg = 12.4s, dev = 5.7s
//tensorflow/compiler/tests:ternary_ops_test_cpu_mlir_bridge_test PASSED in 19.1s Stats over 8 runs: max = 19.1s, min = 6.1s, avg = 11.4s, dev = 4.4s
//tensorflow/dtensor/python/tests:input_util_test PASSED in 35.4s Stats over 8 runs: max = 35.4s, min = 13.5s, avg = 19.9s, dev = 6.8s
//tensorflow/dtensor/python/tests:save_restore_v2_test_cpu PASSED in 15.9s Stats over 8 runs: max = 15.9s, min = 5.8s, avg = 9.6s, dev = 3.8s
//tensorflow/python/data/experimental/kernel_tests:csv_dataset_test PASSED in 42.9s Stats over 8 runs: max = 42.9s, min = 9.3s, avg = 21.7s, dev = 13.4s
//tensorflow/python/data/experimental/kernel_tests:global_shuffle_test PASSED in 37.1s Stats over 8 runs: max = 37.1s, min = 26.2s, avg = 30.9s, dev = 3.1s
//tensorflow/python/data/experimental/kernel_tests:parallel_interleave_test PASSED in 41.1s Stats over 8 runs: max = 41.1s, min = 12.3s, avg = 28.1s, dev = 10.3s
//tensorflow/python/data/experimental/kernel_tests/service:coordinated_read_ft_test PASSED in 37.9s Stats over 8 runs: max = 37.9s, min = 5.1s, avg = 21.6s, dev = 12.8s
//tensorflow/python/data/experimental/kernel_tests/service:coordinated_read_test PASSED in 40.1s Stats over 8 runs: max = 40.1s, min = 7.4s, avg = 16.8s, dev = 12.1s
//tensorflow/python/data/experimental/kernel_tests/service:cross_trainer_cache_test PASSED in 28.4s Stats over 8 runs: max = 28.4s, min = 5.2s, avg = 13.5s, dev = 8.0s
//tensorflow/python/data/experimental/kernel_tests/service:distributed_save_load_ft_test PASSED in 94.3s Stats over 8 runs: max = 94.3s, min = 14.9s, avg = 30.9s, dev = 24.9s
//tensorflow/python/data/experimental/kernel_tests/service:distributed_save_load_test PASSED in 65.6s Stats over 8 runs: max = 65.6s, min = 44.5s, avg = 53.1s, dev = 6.5s
//tensorflow/python/data/experimental/kernel_tests/service:distributed_save_test PASSED in 45.2s Stats over 8 runs: max = 45.2s, min = 11.5s, avg = 23.5s, dev = 11.8s
//tensorflow/python/data/experimental/kernel_tests/service:fault_tolerance_test PASSED in 16.8s Stats over 8 runs: max = 16.8s, min = 5.1s, avg = 8.8s, dev = 4.0s
//tensorflow/python/data/kernel_tests:batch_test PASSED in 40.8s Stats over 8 runs: max = 40.8s, min = 18.7s, avg = 27.4s, dev = 8.3s
//tensorflow/python/data/kernel_tests:filter_test PASSED in 28.0s Stats over 8 runs: max = 28.0s, min = 13.1s, avg = 18.2s, dev = 5.2s
//tensorflow/python/data/kernel_tests:flat_map_test PASSED in 31.5s Stats over 8 runs: max = 31.5s, min = 14.3s, avg = 19.9s, dev = 5.3s
//tensorflow/python/data/kernel_tests:shard_test PASSED in 72.1s Stats over 8 runs: max = 72.1s, min = 43.6s, avg = 56.3s, dev = 10.2s
//tensorflow/python/data/kernel_tests:shuffle_test PASSED in 79.1s Stats over 8 runs: max = 79.1s, min = 44.2s, avg = 54.9s, dev = 10.1s
//tensorflow/python/data/kernel_tests:skip_test PASSED in 64.5s Stats over 8 runs: max = 64.5s, min = 35.2s, avg = 46.8s, dev = 7.9s
//tensorflow/python/data/kernel_tests:tf_record_dataset_test PASSED in 34.9s Stats over 8 runs: max = 34.9s, min = 14.0s, avg = 25.2s, dev = 6.0s
//tensorflow/python/distribute/failure_handling:failure_handler_test PASSED in 90.3s Stats over 8 runs: max = 90.3s, min = 32.9s, avg = 60.4s, dev = 19.4s
//tensorflow/python/distribute/failure_handling:gce_failure_handler_test PASSED in 120.2s Stats over 8 runs: max = 120.2s, min = 9.8s, avg = 45.0s, dev = 42.8s
//tensorflow/python/kernel_tests/linalg:linalg_ops_test_cpu PASSED in 69.0s Stats over 8 runs: max = 69.0s, min = 39.7s, avg = 55.4s, dev = 9.3s
//tensorflow/python/kernel_tests/linalg:linear_operator_block_diag_test_cpu PASSED in 186.6s Stats over 8 runs: max = 186.6s, min = 137.4s, avg = 162.4s, dev = 14.6s
//tensorflow/python/kernel_tests/linalg:linear_operator_block_lower_triangular_test_cpu PASSED in 124.8s Stats over 8 runs: max = 124.8s, min = 88.3s, avg = 105.9s, dev = 13.5s
//tensorflow/python/kernel_tests/nn_ops:depthwise_conv_op_d9m_test_cpu PASSED in 84.9s Stats over 8 runs: max = 84.9s, min = 3.7s, avg = 19.1s, dev = 26.8s
//tensorflow/python/kernel_tests/nn_ops:depthwise_conv_op_test_cpu PASSED in 12.7s Stats over 8 runs: max = 12.7s, min = 3.6s, avg = 5.0s, dev = 2.9s
//tensorflow/python/ops/ragged:dynamic_ragged_shape_test PASSED in 49.1s Stats over 8 runs: max = 49.1s, min = 24.7s, avg = 33.4s, dev = 7.8s
//tensorflow/python/ops/ragged:ragged_tensor_test PASSED in 21.3s Stats over 8 runs: max = 21.3s, min = 8.2s, avg = 13.4s, dev = 4.7s
//tensorflow/compiler/tests:conv2d_test_cpu PASSED in 58.5s Stats over 10 runs: max = 58.5s, min = 5.5s, avg = 11.2s, dev = 15.8s
//tensorflow/compiler/tests:conv2d_test_cpu_mlir_bridge_test PASSED in 54.1s Stats over 10 runs: max = 54.1s, min = 5.6s, avg = 11.0s, dev = 14.4s
//tensorflow/compiler/tests:random_ops_test_cpu PASSED in 11.6s Stats over 10 runs: max = 11.6s, min = 5.1s, avg = 8.4s, dev = 2.0s
//tensorflow/compiler/tests:random_ops_test_cpu_mlir_bridge_test PASSED in 10.3s Stats over 10 runs: max = 10.3s, min = 3.6s, avg = 7.1s, dev = 1.9s
//tensorflow/compiler/tests:stateful_random_ops_test_cpu PASSED in 13.1s Stats over 10 runs: max = 13.1s, min = 8.4s, avg = 9.6s, dev = 1.3s
//tensorflow/compiler/tests:stateful_random_ops_test_cpu_mlir_bridge_test PASSED in 28.9s Stats over 10 runs: max = 28.9s, min = 8.9s, avg = 13.2s, dev = 6.4s
//tensorflow/compiler/tests:stateless_random_ops_test_cpu PASSED in 86.7s Stats over 10 runs: max = 86.7s, min = 38.5s, avg = 56.7s, dev = 15.5s
//tensorflow/compiler/tests:stateless_random_ops_test_cpu_mlir_bridge_test PASSED in 77.9s Stats over 10 runs: max = 77.9s, min = 34.9s, avg = 51.0s, dev = 12.8s
//tensorflow/python/data/kernel_tests:rejection_resample_test PASSED in 29.9s Stats over 10 runs: max = 29.9s, min = 5.6s, avg = 15.0s, dev = 8.0s
//tensorflow/python/distribute:input_lib_type_spec_test_2gpu PASSED in 22.6s Stats over 10 runs: max = 22.6s, min = 7.0s, avg = 14.8s, dev = 4.9s
//tensorflow/python/distribute:input_lib_type_spec_test_cpu PASSED in 22.8s Stats over 10 runs: max = 22.8s, min = 6.8s, avg = 15.0s, dev = 5.2s
//tensorflow/python/framework:function_test_cpu PASSED in 61.4s Stats over 10 runs: max = 61.4s, min = 5.6s, avg = 14.6s, dev = 16.2s
//tensorflow/python/kernel_tests/array_ops:array_ops_test_cpu PASSED in 19.0s Stats over 10 runs: max = 19.0s, min = 7.2s, avg = 12.2s, dev = 3.6s
//tensorflow/python/kernel_tests/array_ops:inplace_ops_test_cpu PASSED in 14.1s Stats over 10 runs: max = 14.1s, min = 3.7s, avg = 5.5s, dev = 3.1s
//tensorflow/python/kernel_tests/data_structures:tensor_array_ops_test_cpu PASSED in 15.2s Stats over 10 runs: max = 15.2s, min = 6.2s, avg = 9.2s, dev = 2.8s
//tensorflow/python/kernel_tests/linalg:linear_operator_tridiag_test_cpu PASSED in 113.3s Stats over 10 runs: max = 113.3s, min = 89.2s, avg = 98.7s, dev = 7.8s
//tensorflow/python/kernel_tests/linalg/sparse:csr_sparse_matrix_ops_test_cpu PASSED in 83.0s Stats over 10 runs: max = 83.0s, min = 9.4s, avg = 45.0s, dev = 23.5s
//tensorflow/python/kernel_tests/linalg/sparse:csr_sparse_matrix_sparse_mat_mul_grad_test_cpu PASSED in 9.3s Stats over 10 runs: max = 9.3s, min = 3.7s, avg = 5.6s, dev = 1.7s
//tensorflow/python/kernel_tests/math_ops:cwise_ops_unary_test_cpu PASSED in 18.7s Stats over 10 runs: max = 18.7s, min = 6.6s, avg = 11.4s, dev = 3.1s
//tensorflow/python/kernel_tests/math_ops:segment_reduction_ops_test_cpu PASSED in 33.7s Stats over 10 runs: max = 33.7s, min = 4.9s, avg = 16.6s, dev = 10.0s
//tensorflow/python/kernel_tests/nn_ops:pooling_ops_test_cpu PASSED in 42.3s Stats over 10 runs: max = 42.3s, min = 5.4s, avg = 14.9s, dev = 13.1s
//tensorflow/python/kernel_tests/nn_ops:rnn_test_cpu PASSED in 15.5s Stats over 10 runs: max = 15.5s, min = 6.5s, avg = 8.6s, dev = 2.7s
//tensorflow/python/kernel_tests/random:random_index_shuffle_test PASSED in 11.9s Stats over 10 runs: max = 11.9s, min = 5.5s, avg = 7.5s, dev = 1.8s
//tensorflow/python/kernel_tests/random:stateless_random_ops_test_cpu PASSED in 171.1s Stats over 10 runs: max = 171.1s, min = 27.9s, avg = 100.0s, dev = 63.4s
//tensorflow/python/ops:special_math_ops_test_cpu PASSED in 56.6s Stats over 10 runs: max = 56.6s, min = 6.4s, avg = 15.0s, dev = 14.2s
//tensorflow/python/ops:weak_tensor_special_math_ops_test_cpu PASSED in 18.2s Stats over 10 runs: max = 18.2s, min = 5.5s, avg = 10.7s, dev = 4.7s
//tensorflow/python/ops/numpy_ops/tests:np_indexing_test PASSED in 116.5s Stats over 10 runs: max = 116.5s, min = 88.8s, avg = 95.7s, dev = 7.7s
//tensorflow/python/ops/ragged:ragged_tensor_supported_values_test PASSED in 18.9s Stats over 10 runs: max = 18.9s, min = 12.8s, avg = 14.5s, dev = 2.1s
//tensorflow/python/saved_model:load_test_cpu PASSED in 67.6s Stats over 10 runs: max = 67.6s, min = 32.4s, avg = 42.5s, dev = 10.0s
//tensorflow/compiler/tests:fft_test_cpu PASSED in 33.1s Stats over 12 runs: max = 33.1s, min = 6.3s, avg = 12.9s, dev = 7.2s
//tensorflow/python/data/experimental/kernel_tests:group_by_reducer_test PASSED in 27.3s Stats over 12 runs: max = 27.3s, min = 5.2s, avg = 12.4s, dev = 6.5s
//tensorflow/python/data/kernel_tests:choose_from_datasets_test PASSED in 21.8s Stats over 12 runs: max = 21.8s, min = 4.5s, avg = 13.3s, dev = 6.6s
//tensorflow/python/data/kernel_tests:memory_cleanup_test_cpu PASSED in 11.0s Stats over 12 runs: max = 11.0s, min = 5.5s, avg = 7.8s, dev = 1.7s
//tensorflow/python/distribute:moving_averages_test_2gpu PASSED in 22.5s Stats over 12 runs: max = 22.5s, min = 13.4s, avg = 16.2s, dev = 2.5s
//tensorflow/python/distribute:moving_averages_test_cpu PASSED in 22.5s Stats over 12 runs: max = 22.5s, min = 11.9s, avg = 16.2s, dev = 2.9s
//tensorflow/python/eager/polymorphic_function:polymorphic_function_test_cpu PASSED in 37.9s Stats over 15 runs: max = 37.9s, min = 13.6s, avg = 18.7s, dev = 5.6s
//tensorflow/python/kernel_tests/linalg:linear_operator_low_rank_update_test_cpu PASSED in 112.8s Stats over 15 runs: max = 112.8s, min = 99.4s, avg = 103.5s, dev = 3.3s
//tensorflow/python/kernel_tests/nn_ops:rnn_cell_test_cpu PASSED in 64.0s Stats over 15 runs: max = 64.0s, min = 6.2s, avg = 16.8s, dev = 16.1s
//tensorflow/python/data/experimental/kernel_tests/service:dynamic_sharding_test PASSED in 13.4s Stats over 16 runs: max = 13.4s, min = 4.4s, avg = 8.0s, dev = 2.6s
//tensorflow/python/data/kernel_tests:snapshot_test PASSED in 48.7s Stats over 16 runs: max = 48.7s, min = 15.0s, avg = 26.3s, dev = 9.0s
//tensorflow/python/kernel_tests/control_flow:control_flow_ops_py_test_cpu PASSED in 38.1s Stats over 16 runs: max = 38.1s, min = 6.8s, avg = 11.7s, dev = 7.4s
//tensorflow/python/kernel_tests/linalg:matrix_exponential_op_test PASSED in 12.3s Stats over 16 runs: max = 12.3s, min = 4.6s, avg = 6.3s, dev = 2.0s
//tensorflow/python/kernel_tests/signal:dct_ops_test_cpu PASSED in 19.9s Stats over 16 runs: max = 19.9s, min = 8.6s, avg = 11.4s, dev = 2.7s
//tensorflow/python/ops:image_ops_test_cpu PASSED in 20.8s Stats over 16 runs: max = 20.8s, min = 7.8s, avg = 15.3s, dev = 3.5s
//tensorflow/python/data/experimental/kernel_tests/service:distributed_save_ft_test FLAKY, failed in 1 out of 18 in 900.1s Stats over 18 runs: max = 900.1s, min = 10.3s, avg = 82.5s, dev = 201.0s
  /home/buildslave/.cache/bazel/_bazel_buildslave/fbac33eb30dbfb6b11b15a7ff5ac830d/execroot/org_tensorflow/bazel-out/aarch64-opt/testlogs/tensorflow/python/data/experimental/kernel_tests/service/distributed_save_ft_test/shard_1_of_17/test_attempts/attempt_1.log
//tensorflow/python/data/kernel_tests:map_test PASSED in 80.5s Stats over 19 runs: max = 80.5s, min = 36.0s, avg = 58.3s, dev = 10.6s
//tensorflow/compiler/tests:pooling_ops_3d_test_cpu PASSED in 9.6s Stats over 20 runs: max = 9.6s, min = 3.4s, avg = 4.1s, dev = 1.3s
//tensorflow/compiler/tests:pooling_ops_3d_test_cpu_mlir_bridge_test PASSED in 10.3s Stats over 20 runs: max = 10.3s, min = 3.8s, avg = 4.7s, dev = 1.4s
//tensorflow/compiler/tests:pooling_ops_test_cpu PASSED in 11.6s Stats over 20 runs: max = 11.6s, min = 4.4s, avg = 5.7s, dev = 1.8s
//tensorflow/compiler/tests:pooling_ops_test_cpu_mlir_bridge_test PASSED in 12.0s Stats over 20 runs: max = 12.0s, min = 3.8s, avg = 6.1s, dev = 2.2s
//tensorflow/compiler/tests:stochastic_cast_op_test_cpu PASSED in 10.7s Stats over 20 runs: max = 10.7s, min = 4.8s, avg = 6.0s, dev = 1.2s
//tensorflow/compiler/tests:unary_ops_test_cpu PASSED in 30.2s Stats over 20 runs: max = 30.2s, min = 4.2s, avg = 8.7s, dev = 6.0s
//tensorflow/compiler/tests:unary_ops_test_cpu_mlir_bridge_test PASSED in 32.8s Stats over 20 runs: max = 32.8s, min = 4.4s, avg = 10.7s, dev = 7.9s
//tensorflow/dtensor/python/tests:rng_test_cpu PASSED in 87.2s Stats over 20 runs: max = 87.2s, min = 6.7s, avg = 12.1s, dev = 17.3s
//tensorflow/python/autograph/tests:loop_control_flow_test PASSED in 29.0s Stats over 20 runs: max = 29.0s, min = 12.2s, avg = 15.5s, dev = 3.6s
//tensorflow/python/kernel_tests:metrics_test PASSED in 56.9s Stats over 20 runs: max = 56.9s, min = 10.1s, avg = 26.1s, dev = 15.3s
//tensorflow/python/kernel_tests/array_ops:matrix_band_part_op_test_cpu PASSED in 12.4s Stats over 20 runs: max = 12.4s, min = 3.5s, avg = 4.6s, dev = 1.8s
//tensorflow/python/kernel_tests/data_structures:barrier_ops_test PASSED in 16.3s Stats over 20 runs: max = 16.3s, min = 4.4s, avg = 7.3s, dev = 3.5s
//tensorflow/python/kernel_tests/linalg:eig_op_test PASSED in 60.2s Stats over 20 runs: max = 60.2s, min = 4.5s, avg = 18.2s, dev = 18.4s
//tensorflow/python/kernel_tests/linalg:linalg_grad_test_cpu PASSED in 158.9s Stats over 20 runs: max = 158.9s, min = 39.3s, avg = 84.1s, dev = 34.3s
//tensorflow/python/kernel_tests/linalg:norm_op_test_cpu PASSED in 12.8s Stats over 20 runs: max = 12.8s, min = 6.4s, avg = 7.9s, dev = 1.4s
//tensorflow/python/kernel_tests/linalg:normalize_op_test_cpu PASSED in 16.7s Stats over 20 runs: max = 16.7s, min = 7.3s, avg = 11.2s, dev = 2.7s
//tensorflow/python/kernel_tests/linalg:qr_op_test_cpu PASSED in 182.4s Stats over 20 runs: max = 182.4s, min = 44.4s, avg = 96.0s, dev = 39.2s
//tensorflow/python/kernel_tests/linalg:self_adjoint_eig_op_test_cpu PASSED in 28.4s Stats over 20 runs: max = 28.4s, min = 5.1s, avg = 13.1s, dev = 8.4s
//tensorflow/python/kernel_tests/math_ops:batch_matmul_op_test_cpu PASSED in 22.6s Stats over 20 runs: max = 22.6s, min = 9.0s, avg = 14.2s, dev = 3.9s
//tensorflow/python/kernel_tests/math_ops:matmul_op_test_cpu PASSED in 26.7s Stats over 20 runs: max = 26.7s, min = 16.1s, avg = 21.6s, dev = 3.0s
//tensorflow/python/kernel_tests/math_ops:tensordot_op_test_cpu PASSED in 81.0s Stats over 20 runs: max = 81.0s, min = 6.9s, avg = 36.9s, dev = 24.9s
//tensorflow/python/kernel_tests/nn_ops:embedding_ops_test_cpu PASSED in 38.0s Stats over 20 runs: max = 38.0s, min = 11.4s, avg = 16.3s, dev = 5.3s
//tensorflow/python/data/kernel_tests:interleave_test PASSED in 41.0s Stats over 24 runs: max = 41.0s, min = 11.4s, avg = 23.5s, dev = 7.9s
//tensorflow/python/data/kernel_tests:sample_from_datasets_test PASSED in 31.3s Stats over 24 runs: max = 31.3s, min = 5.0s, avg = 12.5s, dev = 7.6s
//tensorflow/dtensor/python/tests:multi_device_spmd_test_cpu PASSED in 66.9s Stats over 25 runs: max = 66.9s, min = 19.7s, avg = 28.3s, dev = 8.7s
//tensorflow/python/kernel_tests/nn_ops:conv_ops_3d_test_cpu PASSED in 46.7s Stats over 30 runs: max = 46.7s, min = 4.3s, avg = 13.8s, dev = 10.3s
//tensorflow/python/data/experimental/kernel_tests/service:data_service_ops_test PASSED in 19.6s Stats over 32 runs: max = 19.6s, min = 4.2s, avg = 9.1s, dev = 4.3s
//tensorflow/python/data/experimental/kernel_tests/service:worker_tags_test PASSED in 23.9s Stats over 32 runs: max = 23.9s, min = 4.4s, avg = 9.4s, dev = 4.1s
//tensorflow/python/distribute:multi_process_runner_test_2gpu PASSED in 218.1s Stats over 35 runs: max = 218.1s, min = 4.9s, avg = 24.3s, dev = 37.5s
//tensorflow/python/distribute:multi_process_runner_test_cpu PASSED in 219.8s Stats over 35 runs: max = 219.8s, min = 4.7s, avg = 24.8s, dev = 37.6s
//tensorflow/core/kernels:stochastic_cast_op_test PASSED in 1.7s Stats over 48 runs: max = 1.7s, min = 0.4s, avg = 0.5s, dev = 0.3s
//tensorflow/compiler/mlir/quantization/tensorflow/python:quantize_model_test PASSED in 94.2s Stats over 50 runs: max = 94.2s, min = 24.4s, avg = 44.9s, dev = 13.8s
//tensorflow/compiler/tests:sort_ops_test_cpu PASSED in 30.6s Stats over 50 runs: max = 30.6s, min = 3.9s, avg = 10.9s, dev = 5.7s
//tensorflow/compiler/tests:sort_ops_test_cpu_mlir_bridge_test PASSED in 13.1s Stats over 50 runs: max = 13.1s, min = 3.2s, avg = 7.4s, dev = 2.6s
//tensorflow/python/kernel_tests/linalg:linear_operator_circulant_test_cpu PASSED in 79.8s Stats over 50 runs: max = 79.8s, min = 45.2s, avg = 61.9s, dev = 8.5s
//tensorflow/python/kernel_tests/linalg/sparse:csr_sparse_matrix_dense_mat_mul_grad_test_cpu PASSED in 17.4s Stats over 50 runs: max = 17.4s, min = 5.6s, avg = 10.2s, dev = 3.1s
//tensorflow/python/kernel_tests/linalg/sparse:csr_sparse_matrix_dense_mat_mul_onednn_grad_test PASSED in 17.0s Stats over 50 runs: max = 17.0s, min = 6.0s, avg = 10.7s, dev = 3.2s
//tensorflow/python/kernel_tests/math_ops:cwise_ops_binary_test_cpu PASSED in 44.7s Stats over 50 runs: max = 44.7s, min = 9.8s, avg = 20.7s, dev = 7.7s
//tensorflow/python/kernel_tests/math_ops:cwise_ops_test_cpu PASSED in 13.6s Stats over 50 runs: max = 13.6s, min = 3.7s, avg = 5.9s, dev = 1.7s
Executed 3077 out of 3077 tests: 3077 tests pass.
There were tests whose specified size is too big. Use the --test_verbose_timeout_warnings command line option to see which ones these are.
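To surface those size warnings on a rerun, the flag mentioned above can be passed directly to bazel test. A minimal sketch, assuming the same workspace; the target pattern is illustrative (any test pattern works, "..." is Bazel's recursive wildcard):

  bazel test --test_verbose_timeout_warnings //tensorflow/...

With this flag, Bazel reports each test whose declared size or timeout is larger than its observed runtime warrants, so the oversized size attributes flagged in the summary can be tightened in the corresponding BUILD files.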