Category: scala

I’ve created a Docker image to host a Scala application, where the application loads its database-connection information from a properties file located in its directory:

driverClassName=org.postgresql.Driver
jdbcUri=jdbc:postgresql://localhost:5432/pedscreen
user=$APP_USER
password=$APP_PASSWORD

I’ve also created a Docker image to host a Postgres database. The docker-compose.yaml file:

version: "3.7"
services:
  database:
    image: postgres:9.6.22
    container_name: "postgres"
    environment:
      POSTGRES_PASSWORD: 'XXXXXXXXXX'
    volumes: ..
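As context, one way the application could resolve the $APP_USER / $APP_PASSWORD placeholders is to substitute them from the container's environment after loading the file, since plain java.util.Properties performs no expansion of its own. A minimal sketch (the file name is hypothetical; the key shapes follow the excerpt above):

```scala
import java.io.FileInputStream
import java.util.Properties

object DbProperties {
  // Loads the properties file and replaces values like "$APP_USER" with the
  // matching environment variable, since java.util.Properties does not expand
  // placeholders by itself. The default path is a hypothetical example.
  def load(path: String = "db.properties"): Properties = {
    val raw = new Properties()
    val in  = new FileInputStream(path)
    try raw.load(in) finally in.close()

    val resolved = new Properties()
    val names = raw.stringPropertyNames().iterator()
    while (names.hasNext) {
      val key   = names.next()
      val value = raw.getProperty(key)
      val expanded =
        if (value.startsWith("$")) sys.env.getOrElse(value.drop(1), value)
        else value
      resolved.setProperty(key, expanded)
    }
    resolved
  }
}
```

Note also that inside a compose network the JDBC host would normally be the database service name (here `database`) rather than localhost.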

Read more

I have an application which connects to a Keycloak server. I want to use AdapterConfig in KeycloakDeploymentBuilder.

def createAdapterConfig(): AdapterConfig = {
  val adapterConfig = new AdapterConfig
  val credentials = new util.HashMap[String, AnyRef]
  credentials.put("secret", keyCloakParams.clientSecret)
  adapterConfig.setRealm(keyCloakParams.realm)
  adapterConfig.setResource(keyCloakParams.clientID)
  adapterConfig.setAuthServerUrl(keyCloakParams.host + "/auth/")
  adapterConfig.setSslRequired(keyCloakParams.sslRequired)
  adapterConfig.setPublicClient(keyCloakParams.publicClient)
  adapterConfig.setConfidentialPort(keyCloakParams.confidentialPort)
  adapterConfig.setCredentials(credentials)
  adapterConfig
}

private val keycloakDeployment: KeycloakDeployment =
  KeycloakDeploymentBuilder.build(createAdapterConfig())

And it works.. locally. ..
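Since the deployment reportedly works only locally, a plausible difference between environments is where keyCloakParams.host points: inside a container, localhost refers to the application container itself. A rough sketch of sourcing those parameters from the environment so the same build can target the Keycloak container by service name (the KeyCloakParams shape and variable names are guesses, not the post's actual code):

```scala
// Hypothetical: resolve Keycloak settings from the environment so the same
// image works on a developer machine and inside a compose/Kubernetes network.
final case class KeyCloakParams(
  realm: String,
  clientID: String,
  host: String,
  clientSecret: String,
  sslRequired: String,
  publicClient: Boolean,
  confidentialPort: Int
)

object KeyCloakParams {
  def fromEnv(): KeyCloakParams = KeyCloakParams(
    realm            = sys.env.getOrElse("KEYCLOAK_REALM", "master"),
    clientID         = sys.env.getOrElse("KEYCLOAK_CLIENT_ID", "my-client"),
    // Inside docker-compose this would be e.g. "http://keycloak:8080"
    host             = sys.env.getOrElse("KEYCLOAK_HOST", "http://localhost:8080"),
    clientSecret     = sys.env.getOrElse("KEYCLOAK_CLIENT_SECRET", ""),
    sslRequired      = sys.env.getOrElse("KEYCLOAK_SSL_REQUIRED", "external"),
    publicClient     = false,
    confidentialPort = 0
  )
}
```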

Read more

I have Spark running in one container and Cassandra running in another. These are the images I have used: for Spark – jupyter/all-spark-notebook, for Cassandra – cassandra:latest. The Scala version in the Spark container is 2.12 and Spark is 3.0.2. I have added the following line to the spark-defaults.conf file:

spark.jars.packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.2

Both of them are configured on ..
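For reference, once the connector package resolves, reading from Cassandra in Scala typically looks like the sketch below; the connection host, keyspace, and table names are placeholders for whatever the Docker network actually uses:

```scala
import org.apache.spark.sql.SparkSession

object CassandraReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cassandra-read-sketch")
      // "cassandra" is assumed to be the hostname of the Cassandra container
      // on the shared Docker network; adjust to your setup.
      .config("spark.cassandra.connection.host", "cassandra")
      .config("spark.cassandra.connection.port", "9042")
      .getOrCreate()

    // Hypothetical keyspace and table names, just to exercise the connector.
    val df = spark.read
      .format("org.apache.spark.sql.cassandra")
      .option("keyspace", "demo_ks")
      .option("table", "demo_table")
      .load()

    df.show(10, truncate = false)
    spark.stop()
  }
}
```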

Read more

We are migrating our Spark workloads from Cloudera to Kubernetes. For demo purposes, we wish to run one of our Spark jobs within a minikube cluster using spark-submit in cluster mode. I would like to pass a Typesafe config file to my executors using the spark.files conf (I tried --files as well). The configuration file ..
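One common pattern, sketched below under the assumption that a file distributed via --files / spark.files ends up in the container's working directory, is to parse it by bare file name with Typesafe Config and fall back to the classpath when running locally:

```scala
import java.io.File
import com.typesafe.config.{Config, ConfigFactory}

object JobConfig {
  // Files shipped with --files / spark.files are placed in the working
  // directory of the driver and executor containers, so load them by bare
  // file name rather than from the classpath.
  def load(fileName: String = "application.conf"): Config = {
    val local = new File(fileName)
    if (local.exists())
      ConfigFactory.parseFile(local).resolve()
    else
      // Fall back to whatever is on the classpath when running locally.
      ConfigFactory.load()
  }
}
```

A hypothetical lookup would then be JobConfig.load().getString("db.url").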

Read more

I am trying to set up and use a local Spark development environment using docker-compose. I created a docker-compose stack made up of three services: a Spark master using bitnami/spark, one Spark worker using bitnami/spark, and a development container with Scala and sbt (based on hseeberger/scala-sbt). Working in the development container, I created a very simple HelloWorld ..
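A HelloWorld along these lines, pointed at the master service over the compose network, is usually enough to exercise the stack; the spark-master host name and SPARK_MASTER_URL variable below are assumptions about the compose file, not values from the post:

```scala
import org.apache.spark.sql.SparkSession

object HelloWorld {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hello-world")
      // "spark-master:7077" is a guess at the compose service name; the
      // bitnami image exposes the standalone master on 7077 by default.
      .master(sys.env.getOrElse("SPARK_MASTER_URL", "spark://spark-master:7077"))
      .getOrCreate()

    val sum = spark.sparkContext.parallelize(1 to 10).reduce(_ + _)
    println(s"Sum of 1..10 = $sum")

    spark.stop()
  }
}
```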

Read more

TL;DR: How to fix the java.lang.IllegalStateException: Cannot find any build directories. error while submitting a Spark job in a standalone cluster? I package a Spark application inside a Docker image with sbt-native-packager. This leads to an image with all required jars:

docker run --rm -it --entrypoint ls myimage:latest -l lib
total 199464
[…]
-r--r--r-- 1 demiourgos728 ..
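For context, a typical sbt-native-packager setup that produces such an image looks roughly like the following; the plugin version, main class, and base image are illustrative rather than taken from the post:

```scala
// project/plugins.sbt (version illustrative)
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.6")

// build.sbt
enablePlugins(JavaAppPackaging, DockerPlugin)

Compile / mainClass := Some("com.example.Main")   // hypothetical main class
dockerBaseImage     := "openjdk:11-jre-slim"      // any JRE base image works
```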

Read more

I am trying to read records from one Oracle table and dump them into another Oracle table. I am new to the Spark/Scala tech stack. Below is the code; it has been running forever on my local machine.

package com.demo.spark

import java.util.logging.Logger
import org.apache.spark.sql.{SaveMode, SparkSession}

object HelloSpark extends Serializable {
  @transient lazy val logger: ..
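For comparison, here is a sketch of the same copy with an explicitly partitioned JDBC read; the connection URL, table names, credentials, and bounds are placeholders. A single-partition JDBC read (no partitionColumn/numPartitions) is a frequent reason such a job appears to run forever on a large table:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object OracleCopySketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("oracle-copy-sketch")
      .master("local[*]")
      .getOrCreate()

    val jdbcUrl = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1" // hypothetical

    // Without partitionColumn/numPartitions the read runs in a single task,
    // which can look like the job is hanging on a large table.
    val source = spark.read
      .format("jdbc")
      .option("url", jdbcUrl)
      .option("dbtable", "SRC_SCHEMA.SOURCE_TABLE")   // hypothetical
      .option("user", sys.env("ORACLE_USER"))
      .option("password", sys.env("ORACLE_PASSWORD"))
      .option("driver", "oracle.jdbc.OracleDriver")
      .option("partitionColumn", "ID")                // numeric key to split on
      .option("lowerBound", "1")
      .option("upperBound", "1000000")
      .option("numPartitions", "8")
      .load()

    source.write
      .format("jdbc")
      .option("url", jdbcUrl)
      .option("dbtable", "TGT_SCHEMA.TARGET_TABLE")   // hypothetical
      .option("user", sys.env("ORACLE_USER"))
      .option("password", sys.env("ORACLE_PASSWORD"))
      .option("driver", "oracle.jdbc.OracleDriver")
      .mode(SaveMode.Append)
      .save()

    spark.stop()
  }
}
```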

Read more

I’m trying to run a Spark application on k8s and it fails immediately after container creation, with no specific error. I see only exit code 101.

Container name: spark-kubernetes-driver
Container image: image URL
Container state: Terminated
Exit code: 101

Driver logs:

+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.1.0.10 ..
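Exit code 101 alone says little, so one low-tech aid, sketched below with hypothetical names, is a driver main that avoids hardcoding a master URL (spark-submit supplies the k8s master in cluster mode) and prints any startup exception before exiting, so kubectl logs shows the real failure:

```scala
import org.apache.spark.sql.SparkSession

object K8sSmokeTest {
  def main(args: Array[String]): Unit = {
    try {
      // Do not hardcode .master(...) here; on Kubernetes spark-submit
      // injects the k8s master and the driver/executor pod settings.
      val spark = SparkSession.builder()
        .appName("k8s-smoke-test")
        .getOrCreate()

      val n = spark.sparkContext.parallelize(1 to 1000).count()
      println(s"Counted $n elements - cluster is functional")
      spark.stop()
    } catch {
      case e: Throwable =>
        // Make sure the failure reason shows up in `kubectl logs` before
        // the container exits with a non-zero code.
        e.printStackTrace()
        sys.exit(1)
    }
  }
}
```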

Read more