Using Presto on Cloud Dataproc with Google Cloud SQL?



By : Yuvraj Prajapati
Date : November 23 2020, 09:01 AM
The easiest way to do this is to edit the initialization action that installs Presto on the Cloud Dataproc cluster.
Cloud SQL setup
code :
# Hive connector catalog, pointing at the metastore on the master node
cat > presto-server-${PRESTO_VERSION}/etc/catalog/hive.properties <<EOF
connector.name=hive-hadoop2
hive.metastore.uri=thrift://localhost:9083
EOF

# MySQL connector catalog for Cloud SQL; fill in your instance's address and credentials
cat > presto-server-${PRESTO_VERSION}/etc/catalog/mysql.properties <<EOF
connector.name=mysql
connection-url=jdbc:mysql://<ip_address>:3306
connection-user=<username>
connection-password=<password>
EOF
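For context, a minimal sketch of creating a cluster with the edited script; the cluster name and gs://<your-bucket>/presto.sh are placeholders for wherever you stage the modified initialization action:
code :
# <your-bucket> is a placeholder; upload the edited presto.sh there first
gcloud dataproc clusters create presto-cluster \
    --initialization-actions=gs://<your-bucket>/presto.sh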


How can I run Presto on Google Cloud Dataproc?



By : Elizabeth Peñaranda
Date : November 23 2020, 09:01 AM
You can use an initialization action with a Cloud Dataproc cluster to quickly install and configure Presto. Specifically, there is a GitHub repository of Cloud Dataproc initialization actions, including one for Presto.
If you want to use the Presto WebUI once the cluster is online, you can follow these directions to create an SSH tunnel and SOCKS proxy to the cluster. From there, you can access Presto on the master node on port 8080 (the default, unless you change it), as sketched below.
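A rough sketch of that tunnel, assuming the default master node name (<cluster-name>-m); the cluster name, zone, and local port are placeholders:
code :
# Opens an SSH tunnel with a SOCKS proxy listening on localhost:1080
gcloud compute ssh <cluster-name>-m \
    --zone=<zone> \
    -- -D 1080 -N
With a browser configured to use that proxy, the Presto WebUI is reachable at http://<cluster-name>-m:8080.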
How to set CPUs quota for Google Cloud Dataproc via Cloud Composer?



By : Shabnam Sayyahi
Date : March 29 2020, 07:55 AM
Dataproc worker machines are in fact Compute Engine VMs, so CPU quotas apply to the Compute Engine API.
CPU quotas are not related to Airflow/Google Cloud Composer and cannot be configured from there. DataprocClusterCreateOperator simply calls the Dataproc APIs, which in turn start VMs on Compute Engine.
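If you want to inspect the current CPU quota, one way (the region is a placeholder) is to query Compute Engine directly rather than Composer:
code :
# The output includes a quotas section with the CPUS usage and limit for the region
gcloud compute regions describe <region>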
Google Cloud Dataproc not able to access Hive metastore from Cloud SQL with --scopes=cloud-platform



By : habby
Date : March 29 2020, 07:55 AM
The --initialization-actions flag takes a comma-separated list rather than being repeated once per action to append multiple initialization actions. Try:
code :
--initialization-actions=gs://dataproc-initialization-actions/cloud-sql-proxy/cloud-sql-proxy.sh,gs://dataproc-initialization-actions/presto/presto.sh
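In context, the corrected flag would appear in a full cluster create command along these lines (the cluster name is a placeholder):
code :
gcloud dataproc clusters create <cluster-name> \
    --initialization-actions=gs://dataproc-initialization-actions/cloud-sql-proxy/cloud-sql-proxy.sh,gs://dataproc-initialization-actions/presto/presto.sh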
Authentication errors with Cloud Dataproc and other Google Cloud products



By : aherb92
Date : March 29 2020, 07:55 AM
Google Cloud Dataproc includes some authentication scopes by default but does not presently include scopes for all Google Cloud Platform products. You can add scopes to a cluster when creating it with the Google Cloud SDK by using the --scopes flag.
For example, when running the gcloud beta dataproc clusters create command, you can add the Pub/Sub scope with --scopes https://www.googleapis.com/auth/pubsub. As long as the service honors the "catch all" scope, you can use --scopes https://www.googleapis.com/auth/cloud-platform to add scopes for many services at once.
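As a concrete sketch (the cluster name is a placeholder):
code :
gcloud beta dataproc clusters create <cluster-name> \
    --scopes https://www.googleapis.com/auth/pubsub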
Google Dataproc Presto: how to write Presto query results to Google Cloud Storage?



By : user6018356
Date : March 29 2020, 07:55 AM
I have a Dataproc cluster with Presto installed as an optional component. My data is stored in Google Cloud Storage (GCS) and I'm able to query it with Presto. However, I didn't find a way to write query results back to GCS. I can write to HDFS if I log in to the master node and run Presto commands from there, but it doesn't recognize any GCS location.

You need to create a Hive external table backed by GCS, for example:
code :
gcloud dataproc jobs submit hive \
    --cluster <cluster> \
    --execute "
        CREATE EXTERNAL TABLE my_table (id INT, name STRING)
        STORED AS PARQUET
        LOCATION 'gs://<bucket>/<dir>/';"
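Once the external table exists, query results can be written to it from the Presto CLI via the Hive connector. A sketch, assuming the table above; <source_table> is a placeholder, and note that Presto may require hive.non-managed-table-writes-enabled=true in the Hive catalog properties before it will write to external tables:
code :
# <source_table> is a placeholder for the data you are querying
presto --execute "
    INSERT INTO hive.default.my_table
    SELECT id, name FROM <source_table>"
Because the table's LOCATION is gs://<bucket>/<dir>/, the files produced by the INSERT land in GCS.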