google-bigquery questions


I am working with Google Analytics data in BigQuery, looking to aggregate the dates of last visit and first visit up to the UserID level; however, my code is currently returning the max visit date for that ...

I have a Java-based ETL that transfers data from MySQL to Google BigQuery. Now I want to replace BigQuery's destination project, but I'm getting a NullPointerException from a Google class, which ...

I need to encrypt the sensitive fields in a BigQuery table, but my loading is done through Dataflow. I thought of 3 different ways to do it: encrypt the whole table using a customer-managed key and make ...

Question: Is there a more efficient way to streamline the process of uploading CSV files to BigQuery from a Python script, or any other way? Description: I have 1528596 CSV files which need ...

We are working on an application that stores requests in one table, and responses in another (of course). We can have multiple responses per request, and we store the request ID on both tables. ...

#standardSQL SELECT DISTINCT geoNetwork.country FROM `bigquery-public-data.google_analytics_sample.ga_sessions_*` WHERE _TABLE_SUFFIX BETWEEN '20170701' AND '20170701' AND hits.hour > '11' I'm ...

I am trying to fetch the schema from a BigQuery table. Given sample code like: from google.cloud import bigquery from google.cloud import storage client = bigquery.Client.from_service_account_json('...

I want to persist data from a publicly-accessible API that returns a list of JSON objects, one for each of the past N events, when called. The structure of the JSON objects is simple and consistent. N ...

When loading data from GCS to BQ, is it possible to set "max_bad_records" option with no limitation like: bq load --max_bad_records=1 \ prj:dst.tbl gs://bkt/very_large_and_very_messy.log \ ./schema/...

I'm encountering the following error while running a UNION ALL query in the BigQuery compose-query console (web UI): Encountered " <ID> "UNIONALL "" at line 23, column 10. Was expecting: <EOF> All I'...

Unnesting hits.customdimension and hits.product.customdimension is inflating the transaction revenue SELECT sum(totals.totalTransactionRevenue)/1000000 as revenue, (SELECT MAX(IF(index=10,...

I am using BigQueryIO.readTableRows().fromQuery(...) to read rows from BigQuery then writing TableRow back to BigQuery using BigQueryIO.writeTableRows(). I have table with correct schema already ...

I want to produce a table output as per the SampleData Table & Output. My SQL Query is as follows but it is not giving me the right result for last three columns SELECT UserLogin, COUNT(...

While running the following code in standard SQL in BigQuery, I'm getting an error stating #standardSQL UPDATE dataset.dataset SET New_column = RIGHT(link_id, LEN(link_id) - 3) WHERE TRUE Error: ...
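A likely cause of the error in the question above: RIGHT and LEN come from Legacy SQL / T-SQL and are not part of BigQuery standard SQL, where the equivalent of dropping a fixed three-character prefix is SUBSTR(link_id, 4). A minimal Python sketch of the same transformation (the UPDATE statement in the docstring is illustrative; table and column names are taken from the question):

```python
def strip_prefix(link_id: str, n: int = 3) -> str:
    """Mirror of SUBSTR(link_id, n + 1) in BigQuery standard SQL:
    drop the first n characters.

    The UPDATE would then read:
        UPDATE dataset.dataset
        SET New_column = SUBSTR(link_id, 4)
        WHERE TRUE
    """
    return link_id[n:]

print(strip_prefix("t3_7abcde"))  # drops the "t3_" prefix
```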

I use the following SQL query to extract the number of rows in a table (table1) grouped by the RowDate: SELECT RowDate AS Date, Count(RowDate) as NumberRows FROM [project:dataset.table1] GROUP BY ...

I currently use the following legacy SQL query to check how many rows are returned in each table within a data set each day: SELECT Date, table1, table2 FROM (SELECT RowDate AS Date, Count(RowDate) ...

I want to grant a new user bigquery.user access from the Java API, but somehow I am not able to work it out from the documentation provided. There is not much with respect to API calls. Can someone point me at the ...

I am working on nested Google Analytics data to build a query, I need to unnest 3 levels in order to get all the fields I need, but once I have unnested the SUM() of my .totals fields are far too high,...

How would I get the difference between two dates in days and excluding weekends? For context, this is for the purposes of Service Level Agreement (SLA) monitoring between the date that the ticket is ...
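In BigQuery standard SQL this is commonly done with DATE_DIFF(end, start, DAY) minus a correction for the weekends in the range; the semantics are easiest to see in plain Python. A minimal sketch, counting from the start date (exclusive) to the end date (inclusive) and skipping Saturdays and Sundays:

```python
from datetime import date, timedelta

def weekdays_between(start: date, end: date) -> int:
    """Count days from start (exclusive) to end (inclusive),
    skipping Saturdays and Sundays."""
    count = 0
    d = start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            count += 1
    return count

# Ticket opened Friday 2018-06-01, closed Friday 2018-06-08:
print(weekdays_between(date(2018, 6, 1), date(2018, 6, 8)))  # 5
```

Whether the boundary days count toward the SLA depends on the agreement; adjust the exclusive/inclusive ends accordingly.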

Following on from Select first row in each GROUP BY group? I am trying to do a very similar thing in Google big query. Dataset: fh-bigquery:reddit_comments.2018_01 Aim: For each link_id (Reddit ...

I want to compose a BigQuery query in Apps Script with date1 & date2 variables (as mentioned below). What is the format to pass these 2 variables? var date1="20180601" var date2="20180606" var sql = "...
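The variables are spliced into the SQL string as quoted literals, which in Apps Script is plain string concatenation with +. A Python sketch of the resulting query text (the table name `project.dataset.events_*` is a placeholder, not from the question):

```python
def build_query(date1: str, date2: str) -> str:
    """Splice YYYYMMDD date literals into a _TABLE_SUFFIX filter.

    The Apps Script equivalent is:
        var sql = "... WHERE _TABLE_SUFFIX BETWEEN '" + date1 +
                  "' AND '" + date2 + "'";
    """
    return (
        "SELECT COUNT(*) AS n "
        "FROM `project.dataset.events_*` "
        f"WHERE _TABLE_SUFFIX BETWEEN '{date1}' AND '{date2}'"
    )

print(build_query("20180601", "20180606"))
```

Note the single quotes around the suffix values: they are string literals in the SQL, so they must arrive quoted.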

Would someone please let me know if there is a way to save a BigQuery result in JSON or Avro format? I am using the following code to run the query on a BigQuery table. client = bigquery.Client....
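With the google-cloud-bigquery client, iterating the query result yields row objects that behave like mappings, so they can be serialized to newline-delimited JSON (the format BigQuery itself uses for JSON exports) with the standard library. A sketch using plain dicts as stand-ins for the client's row objects:

```python
import json

def rows_to_json_lines(rows) -> str:
    """Serialize an iterable of row mappings to newline-delimited JSON.
    default=str handles non-JSON types such as dates and timestamps."""
    return "\n".join(json.dumps(dict(row), default=str) for row in rows)

sample = [{"name": "john", "visits": 3}, {"name": "jane", "visits": 5}]
print(rows_to_json_lines(sample))
```

For Avro, the usual route is an extract job to Cloud Storage with destination_format set to AVRO rather than client-side serialization.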

I have tables in BigQuery which I want to export and import into Datastore. How can I achieve that?

I want to extract a table from Bigquery, using python 2.7 and pycharm. I followed the steps proposed by the official google cloud website (https://cloud.google.com/bigquery/docs/exporting-data), but ...

When trying to load data into a BigQuery table, I get an error telling me a row is larger than the maximum allowed size. I could not find this limitation anywhere in the documentation. What is the ...

My question is about code to extract a table from BigQuery and save it as a JSON file. I made my code mostly by following the gcloud tutorials in their documentation. I couldn't ...

I have a simple DAG from airflow import DAG from airflow.contrib.operators.bigquery_operator import BigQueryOperator with DAG(dag_id='my_dags.my_dag') as dag: start = DummyOperator(task_id='...

In our app we collect events every time a push is received (with the name "PUSH_RECEIVED"). Recently we connected our Firebase account to BigQuery to get deeper insight into how our users behave. I have a feeling ...

I want to find the number of unique users active in the last 30 days. I want to calculate this for today, but also for days in the past. The dataset contains user ids, dates and events triggered by ...
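In SQL this is a COUNT(DISTINCT user_id) over a 30-day date window, evaluated once per reporting day (typically by joining against a calendar of dates). The per-day computation itself is simple, as this Python sketch shows:

```python
from datetime import date, timedelta

def active_users_last_30_days(events, as_of: date) -> int:
    """events: iterable of (user_id, event_date) pairs.
    Count distinct users with at least one event in the 30-day
    window ending at as_of (inclusive)."""
    window_start = as_of - timedelta(days=29)
    return len({u for u, d in events if window_start <= d <= as_of})

events = [
    ("a", date(2018, 6, 1)),
    ("a", date(2018, 6, 5)),
    ("b", date(2018, 5, 1)),   # outside the window below
]
print(active_users_last_30_days(events, date(2018, 6, 10)))  # 1
```

To get the metric for every past day in one query, cross-join the events against the list of reporting dates and filter on the 30-day offset before the distinct count.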

Platform: BigQuery (standard) I have a partitioned table by day (table_name_20180101), and I am trying to write a query so that, any given day, it only works on the previous 7 days tables. (So, for ...
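With day-sharded tables, the standard-SQL idiom is a wildcard table filtered by _TABLE_SUFFIX BETWEEN two FORMAT_DATE('%Y%m%d', ...) bounds. A Python sketch that generates the same seven suffixes for a given run date, useful when building the query string client-side:

```python
from datetime import date, timedelta

def last_n_suffixes(run_date: date, n: int = 7) -> list:
    """YYYYMMDD table-name suffixes for the n days before run_date,
    oldest first -- the range a _TABLE_SUFFIX BETWEEN filter would cover."""
    return [(run_date - timedelta(days=i)).strftime("%Y%m%d")
            for i in range(n, 0, -1)]

print(last_n_suffixes(date(2018, 1, 8)))
# ['20180101', ..., '20180107']
```

In pure SQL, the same bounds come from DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) and DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY).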

I have exported a table which has a float column to a .csv file on to cloud storage, also created the JSON schema file for the same table. Now, I tried to import the same csv file with bq load command ...

I need to do the following: Upload a .csv file to Google Drive Once in the Google Cloud, the file should be opened by a Javascript function stored in Google Storage to apply changes on some columns. ...

From Google Bigquery documentation: Running parameterized queries BigQuery supports query parameters to help prevent SQL injection when queries are constructed using user input. This feature ...

I am querying data from bigquery using get_data_from_bq method mentioned below in a loop: def get_data_from_bq(product_ids): format_strings = ','.join([("\"" + str(_id) + "\"") for _id in ...

I'm trying to count distinct users that also match a condition (in this example deleted IS NOT TRUE). I need to group by monthly cohorts with users who were active within the target month and users ...
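The usual BigQuery idiom for a conditional distinct count is COUNT(DISTINCT IF(condition, user_id, NULL)): IF returns NULL for non-matching rows, and COUNT(DISTINCT ...) ignores NULLs. A Python sketch of that semantics:

```python
def distinct_not_deleted(rows) -> int:
    """rows: iterable of (user_id, deleted) pairs.
    Mirrors COUNT(DISTINCT IF(deleted IS NOT TRUE, user_id, NULL)):
    NULL (None) and FALSE both count as 'not deleted'."""
    return len({u for u, deleted in rows if deleted is not True})

rows = [("a", False), ("a", None), ("b", True), ("c", False)]
print(distinct_not_deleted(rows))  # 2
```

The same IF-inside-aggregate pattern works per cohort when combined with GROUP BY on the cohort month.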

Does BigQuery support materialized views? The documentation suggests materializing the query output. Does this mean materialized views are supported, or is this as good as creating a new table with ...

function Master() { var datasetId = 'MasterJoined' var tableId = 'Master'; var job = { configuration: { query: { query: 'SELECT * FROM [BQACCOUNT:SHEETA.SHEETA_History],[BQACCOUNT:SHEETA....

I need to capitalize a string: john doe -> John Doe. How do I do this? I suppose I need to use NORMALIZE_AND_CASEFOLD, but it returns lower case. https://cloud.google.com/bigquery/docs/reference/...
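NORMALIZE_AND_CASEFOLD is for Unicode normalization, not capitalization. BigQuery now ships an INITCAP(string) function for exactly this; where it is unavailable, a per-word upper-first transform does the job, as this Python sketch of INITCAP's behaviour shows:

```python
def initcap(s: str) -> str:
    """Uppercase the first letter of each space-separated word and
    lowercase the rest -- the behaviour of SQL INITCAP."""
    return " ".join(w[:1].upper() + w[1:].lower() for w in s.split(" "))

print(initcap("john doe"))  # John Doe
```

Before INITCAP existed, the common BigQuery workaround was a one-line JavaScript UDF applying the same per-word transform.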

Does anyone know of any way to remove the public datasets from a BigQuery project? Though the risk is very low, I don't want my users to be able to run queries against them and rack up costs. Thanks

I am querying a table on BigQuery that has a field in the 'DATE' format. I want to read this in the 'TIMESTAMP' format. I tried converting the DATE to an integer and then converting into a TIMESTAMP ...
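No integer round-trip is needed: in BigQuery standard SQL, TIMESTAMP(date_col) or CAST(date_col AS TIMESTAMP) converts a DATE directly, yielding midnight UTC on that day. A Python sketch of the resulting value:

```python
from datetime import date, datetime, timezone

def date_to_timestamp(d: date) -> datetime:
    """Midnight UTC for a calendar date -- what TIMESTAMP(date_col)
    returns in BigQuery standard SQL."""
    return datetime(d.year, d.month, d.day, tzinfo=timezone.utc)

print(date_to_timestamp(date(2018, 6, 1)))  # 2018-06-01 00:00:00+00:00
```

If the dates represent a local calendar, TIMESTAMP(date_col, "timezone") takes an explicit zone instead of defaulting to UTC.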

I'm trying to pull data from a Google BigQuery dataset into memory on a Compute Engine instance in the same region, but it's taking far too long. A simple query with a single WHERE clause is taking 10 minutes to ...

I cannot set the region for a BigQuery dataset when using Direct Runner using Apache Beam. I'm trying to get data from Oracle via JdbcIO.read using Apache Beam to get data and push it to BigQuery ...

I want to have a left join in BigQuery. SELECT id,mtr,name FROM (SELECT userid,mtr,name FROM results_20180612_230337 LEFT JOIN table1 ON id=myid where partitiondate=CAST("2018-05-29" AS DATE)) ...

I've done quite a bit of searching, but haven't been able to find anything within this community that fits my problem. I have a MongoDB collection that I would like to normalize and upload to Google ...

Take for example, event_dim.name = "Start_Level" event_dim.params.key = "Chapter_Name" event_dim.params.value.string_value = "chapter_1" (or "chapter_2" or "chapter_3" and so on) event_dim.params....

I'm trying to use BigQuery's INT64 type to hold bit encoded information. I have to use a javascript udf function and I'd like to use all the 64 bits. My issue is that javascript only deals with int32 ...
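JavaScript bitwise operators truncate their operands to 32 bits, so a common workaround is to split the INT64 into high and low 32-bit halves before handing it to the UDF, operate on the halves separately, and recombine. A Python sketch of the split/join arithmetic:

```python
def split64(x: int):
    """Split an unsigned 64-bit value into (hi, lo) 32-bit halves,
    each safe for JavaScript's 32-bit bitwise operators."""
    return (x >> 32) & 0xFFFFFFFF, x & 0xFFFFFFFF

def join64(hi: int, lo: int) -> int:
    """Recombine the halves into the original 64-bit value."""
    return (hi << 32) | lo

hi, lo = split64(0x123456789ABCDEF0)
print(hex(hi), hex(lo))  # 0x12345678 0x9abcdef0
```

Note also that Number in JavaScript only represents integers exactly up to 2^53, so INT64 values should cross the SQL/UDF boundary as strings or as a hi/lo pair, not as a single Number.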

Below is my code in the Django framework (Python 2.7) to list the jobs in BigQuery. I want to filter to just the ones in the last two weeks, but the min_creation_time argument of the list_jobs() function does not work and ...

I'm trying to import Firestore data into BigQuery. Unfortunately I have some map fields and I'm failing to tell BigQuery to treat those fields as JSON. https://firebase.google.com/docs/firestore/...

I want to use the same function for parsing events in two different technologies: Google BigQuery and Dataflow. Is there a language I can do this in? If not, is Google planning to support one any ...

Currently I have a BigQuery table that has more than 10,000 rows. A machine learning algorithm that I already have asks for the data to have the days as columns, so a transpose command ...
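BigQuery has no TRANSPOSE operator; pivoting is done either with conditional aggregation (one MAX(IF(day = 'd1', value, NULL)) per target column) or client-side after extracting the rows, e.g. with a pandas pivot. A dependency-free Python sketch of the client-side reshape, assuming (entity, day, value) rows:

```python
def pivot_days(rows):
    """rows: iterable of (entity_id, day, value) triples.
    Returns {entity_id: {day: value}} -- each day becomes a 'column'."""
    out = {}
    for entity, day, value in rows:
        out.setdefault(entity, {})[day] = value
    return out

rows = [("u1", "2018-06-01", 5), ("u1", "2018-06-02", 7), ("u2", "2018-06-01", 3)]
print(pivot_days(rows))
```

The conditional-aggregation route stays inside SQL but requires the set of day columns to be known (or generated) in advance.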
