org.apache.spark.SparkException: Job aborted due to stage failure: Task 5 in stage 102.0 failed 4 times,
most recent failure: Lost task 5.3 in stage 102.0 (TID 895) (ithdp-nxcals8002.cern.ch executor 6):
java.lang.ArrayIndexOutOfBoundsException
Driver stacktrace: ...
Traceback (most recent call last):
  File "/opt/acc-bpt/release_2025_01/acc_bpt/shared/BptV2.py", line 288, in build
    rawdata = self.bptStatic.fetch(vars, times)
  File "/opt/acc-bpt/release_2025_01/acc_bpt/shared/utils_storage.py", line 158, in wrapper_incrementally_cached
    return FetchCacher.fetch_caching_logic(fetch_self=fetch_self, fetch_func=fetch_func,
  File "/opt/acc-bpt/release_2025_01/acc_bpt/shared/utils_storage.py", line 195, in fetch_caching_logic
    res_data = fetch_func(fetch_self, var_defs, time_ranges, True)
  File "/opt/acc-bpt/release_2025_01/acc_bpt/leir/early/injection_line_bpm_2.py", line 83, in fetch
    pdf = self.nxcals.spark2pandas(res_filtered)
  File "/opt/acc-bpt/release_2025_01/acc_bpt/shared/utils_nxcals.py", line 564, in spark2pandas
    pdf = spark_dataset.toPandas()
  File "/opt/acc-bpt/release_2025_01/venv/lib/python3.11/site-packages/pyspark/sql/pandas/conversion.py", line 131, in toPandas
    batches = self._collect_as_arrow(split_batches=self_destruct)
  File "/opt/acc-bpt/release_2025_01/venv/lib/python3.11/site-packages/pyspark/sql/pandas/conversion.py", line 284, in _collect_as_arrow
    jsocket_auth_server.getResult()
  File "/opt/acc-bpt/release_2025_01/venv/lib/python3.11/site-packages/py4j/java_gateway.py", line 1322, in __call__
    return_value = get_return_value(
  File "/opt/acc-bpt/release_2025_01/venv/lib/python3.11/site-packages/pyspark/errors/exceptions/captured.py", line 179, in deco
    return f(*a, **kw)
  File "/opt/acc-bpt/release_2025_01/venv/lib/python3.11/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o1677.getResult.
: org.apache.spark.SparkException: Exception thrown in awaitResult:
	at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56)
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310)
	at org.apache.spark.security.SocketAuthServer.getResult(SocketAuthServer.scala:98)
	at org.apache.spark.security.SocketAuthServer.getResult(SocketAuthServer.scala:94)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 5 in stage 102.0 failed 4 times,
most recent failure: Lost task 5.3 in stage 102.0 (TID 895) (ithdp-nxcals8002.cern.ch executor 6):
java.lang.ArrayIndexOutOfBoundsException
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247)
	at scala.Option.foreach(Option.scala:407)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1247)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3060)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2994)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2983)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
Caused by: java.lang.ArrayIndexOutOfBoundsException

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/acc-bpt/release_2025_01/jinja/build_static_plots.py", line 85, in _create_static_html_content
    output_html_file_path = driver.build(abs_plot_dest_file_html)
  File "/opt/acc-bpt/release_2025_01/acc_bpt/shared/BptV2.py", line 308, in build
    raise RuntimeError(cause)
RuntimeError: org.apache.spark.SparkException: Job aborted due to stage failure: Task 5 in stage 102.0 failed 4 times,
most recent failure: Lost task 5.3 in stage 102.0 (TID 895) (ithdp-nxcals8002.cern.ch executor 6):
java.lang.ArrayIndexOutOfBoundsException
Driver stacktrace:
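Note on the failure path: the chain shows `spark2pandas` calling `DataFrame.toPandas()`, which in this PySpark build collects the result via Arrow batches (`_collect_as_arrow`); an executor task then died with `java.lang.ArrayIndexOutOfBoundsException`, so the whole collection aborted. One defensive pattern is to retry the conversion after disabling Arrow-based collection (the Spark config key `spark.sql.execution.arrow.pyspark.enabled` is real; everything else below is a hypothetical sketch, not code from the failing application):

```python
def to_pandas_with_fallback(dataset, disable_arrow=None, retries=1):
    """Try dataset.toPandas(); on the first failure, optionally disable
    Arrow collection and retry.

    `dataset` is any object exposing toPandas() (e.g. a pyspark DataFrame).
    `disable_arrow` is a hypothetical hook; with a real SparkSession it
    could be:
        lambda: spark.conf.set(
            "spark.sql.execution.arrow.pyspark.enabled", "false")
    which makes toPandas() fall back to the slower row-based path.
    """
    last_err = None
    for attempt in range(retries + 1):
        try:
            return dataset.toPandas()
        except Exception as err:  # Py4JJavaError in a real Spark session
            last_err = err
            if disable_arrow is not None and attempt == 0:
                disable_arrow()  # retry without Arrow serialization
    raise RuntimeError("toPandas() failed after retries") from last_err
```

This only papers over the symptom; if the `ArrayIndexOutOfBoundsException` comes from malformed records in the source data, the filter applied before `spark2pandas` (line 83 of `injection_line_bpm_2.py`) is the place to investigate.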
Generated 2025-02-27 23:19:40.973991+01:00