I would like to test interactions between two logged-in users on a website that only allows a single user to be logged in for a given browser session. The reason I’d like to avoid creating a second driver is that I’d prefer to use Selenium’s provided Docker containers to run my tests (https://github.com/SeleniumHQ/docker-selenium), which ..
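A note on the setup being described: against the dockerized Grid or standalone image, "a second user" usually means a second Remote session rather than a second locally installed browser, because each Remote session gets its own isolated profile on the node and can therefore hold its own login. A minimal sketch, assuming the container is reachable at localhost:4444 and using a placeholder login URL:

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    HUB_URL = "http://localhost:4444/wd/hub"  # assumption: dockerized Grid/standalone on the default port

    def new_session():
        # Each Remote session gets its own browser profile on the node,
        # so cookies and logins are not shared between the two sessions.
        return webdriver.Remote(command_executor=HUB_URL, options=Options())

    user_a = new_session()
    user_b = new_session()
    try:
        user_a.get("https://example.com/login")  # placeholder URL
        user_b.get("https://example.com/login")
        # ...log each user in and drive the interaction between them...
    finally:
        user_a.quit()
        user_b.quit()

Both sessions run inside the same container stack, so no extra local browser installation is needed.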
I am trying to run Selenium tests locally. I have both Selenium Grid and the frontend set up with docker-compose. However, I am having trouble accessing the frontend's ports as browser.get('http://localhost:8000/upload').

    version: "3.7"
    services:
      chrome:
        image: selenium/node-chrome:4.0.0-beta-3-prerelease-20210402
        volumes:
          - /dev/shm:/dev/shm
        depends_on:
          - selenium-hub
        environment:
          - SE_EVENT_BUS_HOST=selenium-hub
          - SE_EVENT_BUS_PUBLISH_PORT=4442
          - SE_EVENT_BUS_SUBSCRIBE_PORT=4443
        ports:
          - "6900:5900"
      selenium-hub:
        image: selenium/hub:4.0.0-beta-3-prerelease-20210402
        ..
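A detail worth noting for this layout: inside the Compose network, localhost refers to the browser node's own container, so http://localhost:8000/upload will not reach the frontend. A minimal sketch of the usual fix, assuming the frontend service is named frontend in the same docker-compose.yml, listens on port 8000, and the hub publishes 4444 to the host (all of these names are assumptions):

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    # The test process talks to the hub published on the host...
    driver = webdriver.Remote(
        command_executor="http://localhost:4444/wd/hub",
        options=Options(),
    )
    try:
        # ...but the browser itself runs inside the Compose network, so it must
        # address the frontend by its service name, not by localhost.
        driver.get("http://frontend:8000/upload")  # "frontend" is an assumed service name
    finally:
        driver.quit()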
I’m using Selenium in Docker and it worked fine, but today something went wrong. I have this code in my Dockerfile to install the latest versions of Chrome and ChromeDriver:

    # install google chrome
    RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
    RUN sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google-chrome.list'
    ..
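When a "latest Chrome plus latest ChromeDriver" build breaks overnight, the usual culprit is a version mismatch between the two downloads. A hedged Dockerfile sketch that derives the driver version from the installed browser is below; it relies on the chromedriver.storage.googleapis.com LATEST_RELEASE_<major> files, which only cover Chrome up to version 114 (newer releases moved to the Chrome for Testing endpoints), so treat it as an illustration rather than a drop-in fix:

    # install Chrome, then fetch the chromedriver build that matches its major version
    RUN apt-get update && apt-get install -y google-chrome-stable curl unzip
    RUN CHROME_MAJOR=$(google-chrome --version | sed 's/[^0-9]*\([0-9]*\).*/\1/') && \
        DRIVER_VERSION=$(curl -s "https://chromedriver.storage.googleapis.com/LATEST_RELEASE_${CHROME_MAJOR}") && \
        curl -sLo /tmp/chromedriver.zip "https://chromedriver.storage.googleapis.com/${DRIVER_VERSION}/chromedriver_linux64.zip" && \
        unzip /tmp/chromedriver.zip -d /usr/local/bin/ && \
        chmod +x /usr/local/bin/chromedriver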
I’m trying to run my tests on Atlassian Bamboo Server (version 7.2.2), but it is impossible to do because they contain Selenium: "Bamboo can’t open a real browser (IE, Firefox, Chrome…)" [source]. So, I started a Selenium Grid on my localhost using the selenium/hub and selenium/node-chrome-browser Docker images, as described in this tutorial. The main commands to run Selenium Grid ..
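For context, the "main commands" in the classic Grid tutorials boil down to starting the hub and attaching one or more browser nodes to it. A minimal sketch in the Grid 3 style (the image tags and network name are assumptions):

    docker network create grid
    docker run -d --net grid -p 4444:4444 --name selenium-hub selenium/hub:3.141.59
    docker run -d --net grid -e HUB_HOST=selenium-hub -v /dev/shm:/dev/shm selenium/node-chrome:3.141.59

Tests on the host would then point RemoteWebDriver at http://localhost:4444/wd/hub.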
I have the same problem described (and solved!) at https://stackoverflow.com/a/48267887/1288109. The only difference is that my whole environment (Python, Selenium, Chrome) is hosted in Docker. When running the JS:

    return driver.execute_script("""
        var items = document
            .querySelector('downloads-manager')
            .shadowRoot
    …

I get a JavascriptException saying "javascript error: Cannot read property 'shadowRoot' of undefined". So, document.querySelector('downloads-manager') yields undefined. ..
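A small guard that usually narrows this down: make sure the driver is actually on chrome://downloads and that the downloads-manager element has rendered before drilling into its shadowRoot; in older headless Chrome the downloads UI may never appear at all, in which case checking the download directory on disk is the more reliable route. A minimal sketch (the timeout is an arbitrary choice):

    import time

    def downloads_manager_present(driver, timeout=10):
        # Poll until chrome://downloads has rendered its downloads-manager element.
        driver.get("chrome://downloads")
        deadline = time.time() + timeout
        while time.time() < deadline:
            if driver.execute_script(
                "return document.querySelector('downloads-manager') != null"
            ):
                return True
            time.sleep(0.5)
        return False

If this returns False in the container but True locally, the shadow-DOM query itself is not the problem; the downloads page simply isn't rendering in that Chrome.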
I’m using Selenium Grid in Docker. I want to log in to the website that I want to scrape before running many tasks in parallel. In general, I want to do the login once, then keep the auth data and share it between all my nodes. Is that possible, or maybe there is some nicer way to do this? Because when I ..
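One commonly used approach for this (sketched below, not tied to any particular site): log in once, export the session cookies with get_cookies(), and replay them with add_cookie() in every parallel session before the real work starts. Note that add_cookie only accepts cookies for the domain currently loaded, so each worker has to open the site first. The hub address and site URL here are placeholders:

    import json
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    HUB = "http://localhost:4444/wd/hub"   # assumed Grid address
    SITE = "https://example.com"           # placeholder for the site being scraped

    def login_and_export(path="cookies.json"):
        driver = webdriver.Remote(command_executor=HUB, options=Options())
        try:
            driver.get(SITE + "/login")
            # ...perform the login steps here...
            with open(path, "w") as f:
                json.dump(driver.get_cookies(), f)
        finally:
            driver.quit()

    def new_authenticated_session(path="cookies.json"):
        driver = webdriver.Remote(command_executor=HUB, options=Options())
        driver.get(SITE)                    # must be on the cookie's domain before add_cookie
        with open(path) as f:
            for cookie in json.load(f):
                cookie.pop("expiry", None)  # dropping expiry sidesteps type issues seen with some driver versions
                driver.add_cookie(cookie)
        driver.get(SITE)                    # reload now that the session cookies are set
        return driver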
We have our end2end tests written in Node/Protractor and would like to execute them inside Docker. Unfortunately, this fails, as it seems Chrome crashes immediately after starting. This is the log of the docker run process:

    Google Chrome 89.0.4389.114
    Webdriver-manager has started - give her some time
    [11:54:09] I/config_source - curl -o/usr/local/lib/node_modules/protractor/node_modules/webdriver-manager/selenium/chrome-response.xml https://chromedriver.storage.googleapis.com/
    [11:54:09] I/config_source - ..
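Two container-level causes show up again and again when Chrome dies right after startup in Docker: the default 64 MB /dev/shm and Chrome's sandbox. A hedged sketch of the usual docker-level mitigations (the image name is a placeholder):

    # enlarge the shared memory segment Chrome renders into
    docker run --shm-size=2g my-e2e-image
    # or mount the host's /dev/shm instead
    docker run -v /dev/shm:/dev/shm my-e2e-image

On the browser side, the commonly cited counterpart is adding --no-sandbox and --disable-dev-shm-usage to the chromeOptions args in the capabilities block of the Protractor config.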
I am trying to run a Python scraping script in a Docker container. I am using (er, trying to use) the selenium/standalone-chrome image, available here. I’ve seen varying options: just use a Dockerfile like the one below, or use a docker-compose.yml file to string together the Selenium image and a separate container, i.e., the FROM statement ..
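For the "string them together with docker-compose" variant, the usual shape is two services: the stock selenium/standalone-chrome container and a small Python image that only needs the selenium package, talking to it over the network. A minimal sketch (service names, tags, and the scraper image are all assumptions):

    version: "3.7"
    services:
      selenium:
        image: selenium/standalone-chrome:latest
        volumes:
          - /dev/shm:/dev/shm
        ports:
          - "4444:4444"      # published only for debugging from the host
      scraper:
        build: .             # plain Python image with selenium in requirements.txt
        depends_on:
          - selenium
        environment:
          - SELENIUM_URL=http://selenium:4444/wd/hub

The script then connects with webdriver.Remote(command_executor=os.environ["SELENIUM_URL"], options=Options()) instead of launching a local chromedriver.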
I am running headless Chrome using Python and Selenium. For my automated tests we are testing on a non-production environment which is not accessible without installing a .crt certificate on your machine under Current User > Trusted Root Certification Authorities. In the Dockerfile I am installing the certs as below:

    FROM selenium/standalone-chrome-debug:latest
    RUN sudo apt-get update && ..
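If baking the .crt into the image keeps fighting back, a lighter-weight workaround for test environments is to tell Chrome to tolerate the untrusted certificate at the session level instead. A minimal sketch, assuming the tests connect to the standalone container over its Remote endpoint (the URL is an assumption, and set_capability is the Selenium 4 API):

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    options = Options()
    # Both settings make Chrome accept the internal-CA / self-signed certificate.
    options.add_argument("--ignore-certificate-errors")
    options.set_capability("acceptInsecureCerts", True)

    driver = webdriver.Remote(
        command_executor="http://localhost:4444/wd/hub",  # assumed container address
        options=options,
    )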
Going off this SO question/answer, and a few others, I’ve been trying to get the scraping function below to work in a Docker container. Right now, the Dockerfile installs Selenium from my requirements.txt file, and I have the chromedriver executable sitting in the project directory. I don’t think this is the right way, and I’ve ..
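An alternative to shipping a chromedriver binary in the repo is to let the image's package manager provide a matched browser/driver pair; Debian-based Python images have chromium and chromium-driver packages that stay in sync. A hedged Dockerfile sketch (the base image tag and the scraper entrypoint are assumptions):

    FROM python:3.9-slim
    RUN apt-get update && \
        apt-get install -y --no-install-recommends chromium chromium-driver && \
        rm -rf /var/lib/apt/lists/*
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    COPY . .
    CMD ["python", "scraper.py"]   # placeholder entrypoint

The script would then point Selenium at /usr/bin/chromedriver (where the Debian package installs it) and start Chromium with --headless and --no-sandbox.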