Google Analytics


Decision at Dubsmash about Google BigQuery, Amazon SQS, AWS Lambda, Amazon Kinesis, Google Analytics, BigDataAsAService, RealTimeDataProcessing, GeneralAnalytics, ServerlessTaskProcessing

tspecht, Co-Founder and CTO at Dubsmash
Google BigQuery
Amazon SQS
AWS Lambda
Amazon Kinesis
Google Analytics
#BigDataAsAService
#RealTimeDataProcessing
#GeneralAnalytics
#ServerlessTaskProcessing

To accurately measure & track user behaviour on our platform, we quickly moved from our initial Google Analytics setup to a custom-built solution due to resource & pricing concerns.

While this does sound complicated, it’s as easy as clients sending JSON blobs of events to Amazon Kinesis, from where we use AWS Lambda & Amazon SQS to batch and process incoming events and then ingest them into Google BigQuery. Once events are stored in BigQuery (which usually only takes a second from the time the client sends the data until it’s available), we can use almost-standard SQL to simply query for data, while Google makes sure that, even with terabytes of data being scanned, query times stay in the range of seconds rather than hours. Before ingesting their data into the pipeline, our mobile clients aggregate events internally and, once a certain threshold is reached or the app goes to the background, send the events as a JSON blob into the stream.
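For illustration only (this is not Dubsmash's actual client code, and the stream name, flush threshold, and field names are assumptions), the aggregate-then-flush pattern on the client side could look roughly like this in Python with boto3:

```python
import json
import boto3  # AWS SDK; assumes credentials are configured in the environment

kinesis = boto3.client("kinesis", region_name="us-east-1")

event_buffer = []        # in-memory stand-in for the client-side aggregation
FLUSH_THRESHOLD = 50     # illustrative batch size, not the real value

def track(name, properties):
    """Collect an event locally and flush once the buffer is large enough."""
    event_buffer.append({"name": name, "properties": properties})
    if len(event_buffer) >= FLUSH_THRESHOLD:
        flush()

def flush():
    """Send the buffered events to Kinesis as a single JSON blob."""
    if not event_buffer:
        return
    kinesis.put_record(
        StreamName="events",                             # assumed stream name
        Data=json.dumps(event_buffer).encode("utf-8"),
        PartitionKey="device-1234",                      # e.g. a per-device identifier
    )
    event_buffer.clear()
```

The same flush would also run when the app goes to the background, matching the behaviour described above.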

In the past we had workers running that continuously read from the stream, validated and post-processed the data, and then enqueued it for other workers to write to BigQuery. We went ahead and implemented the Lambda-based approach in such a way that Lambda functions are automatically triggered for incoming records, pre-aggregate events, and write them back to SQS, from which we then read and persist the events to BigQuery. While this approach had a couple of bumps in the road, like re-triggering functions asynchronously to keep up with the stream and finding the proper batch sizes, we finally managed to get it running reliably and are very happy with this solution today.
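A rough sketch of that flow, assuming a Kinesis trigger on the Lambda and placeholder queue and table names (none of this is Dubsmash's actual code), might look like:

```python
import base64
import json
from collections import defaultdict

import boto3                        # AWS SDK for the SQS hand-off
from google.cloud import bigquery   # BigQuery streaming inserts

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/event-batches"  # placeholder
TABLE_ID = "my-project.analytics.events"                                      # placeholder

def handler(event, context):
    """Kinesis-triggered Lambda: decode records, pre-aggregate, forward to SQS."""
    events, counts = [], defaultdict(int)
    for record in event["Records"]:
        # Kinesis delivers record payloads base64-encoded
        blob = json.loads(base64.b64decode(record["kinesis"]["data"]))
        for e in blob:               # each blob is one client's batch of events
            counts[e["name"]] += 1
            events.append(e)

    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"counts": dict(counts), "events": events}),
    )
    return {"processed": len(events)}

def persist_batch(message_body):
    """Downstream worker: stream one SQS message's events into BigQuery."""
    batch = json.loads(message_body)
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, batch["events"])  # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```

The batch sizes mentioned above correspond to the Kinesis trigger's batch-size setting on the Lambda side.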



Decision at Dubsmash about Pushwoosh, Google Analytics, Communications, GeneralAnalytics, Analytics, WebPushNotifications

tspecht, Co-Founder and CTO at Dubsmash
Pushwoosh
Google Analytics
#Communications
#GeneralAnalytics
#Analytics
#WebPushNotifications

We used Google Analytics to track user and market growth and Pushwoosh to send out push notifications by hand to promote new content. Even though we didn’t localize our pushes at all, we added custom tags to devices when registering with the service so we could easily target certain markets (e.g. send a push to German users only), which was totally sufficient at the time.
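As a hedged illustration of the tagging step (the endpoint, application code, and tag names below are assumptions based on Pushwoosh's Remote API, not Dubsmash's actual integration):

```python
import requests  # plain HTTP client; Pushwoosh also offers platform SDKs

SET_TAGS_URL = "https://cp.pushwoosh.com/json/1.3/setTags"  # assumed Remote API endpoint
APP_CODE = "XXXXX-XXXXX"                                    # placeholder application code

def tag_device(hwid, country, language):
    """Attach market-related tags to a device at registration time so pushes
    can later target, say, German users only."""
    payload = {
        "request": {
            "application": APP_CODE,
            "hwid": hwid,              # device identifier sent at registration
            "tags": {                  # hypothetical tag names
                "Country": country,
                "Language": language,
            },
        }
    }
    response = requests.post(SET_TAGS_URL, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()

# Example: tag a device as German so a DE-only push can reach it
tag_device("device-1234", country="DE", language="de")
```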



Decision at StackShare about Google Analytics, Segment, Amplitude, Analyticsstack, Analytics, FunnelAnalysisAnalytics

yonasb, CEO at StackShare
Google Analytics
Segment
Amplitude
#Analyticsstack
#Analytics
#FunnelAnalysisAnalytics

Adopting Amplitude was one of the best decisions we've made. We didn't try any of the alternatives: the free tier was really generous, so it was easy to justify trying it out (via Segment). We've had Google Analytics since inception, but just for logged-out traffic. We knew we'd need some sort of #FunnelAnalysisAnalytics solution, so it came down to just a few options.

We had heard good things about Amplitude from friends and even had a consultant/advisor who was an Amplitude pro from using it at his company, so he kinda convinced us to splurge on the Enterprise tier for the behavioral cohorts alone. The queries they provide via a few clicks in their UI would take days or weeks to craft in SQL. The behavioral cohorts allow us to create a lot of useful retention charts.

Another really useful feature is kinda minor, but kinda not. When you change a saved chart, a new URL gets generated and is visible in your browser (chartURL/edit), and that URL is immediately available to share with your team. It may sound inconsequential, but in practice it makes it really easy to share and iterate on graphs. My only complaint is that you have to explicitly tag other team members as owners of whatever chart you're creating for them to be able to edit and save it. I can see why this is the case, but more often than not, the people I'm sharing the chart with are the ones I want to edit it 🤷🏾‍♂️

The Engagement Matrix feature is also really helpful (once you filter out the noisy events). Charts and dashboards are also great and make it easy for us to focus on the important metrics. We've been using Amplitude in production for about 6 months now. There are a bunch of other features we don't use regularly, like Pathfinder, that I personally don't fully understand yet, but I'm sure we'll start using them eventually.

Again, haven't tried any of the alternatives like Heap, Mixpanel, or Kissmetrics so can't speak to those, but Amplitude works great for us.



Decision at Carrrot about FullStory, Google Analytics, Adobe Photoshop, Webflow

roman_eaton, Product Manager at Carrrot
FullStory
Google Analytics
Adobe Photoshop
Webflow

We chose Webflow to build websites faster and to make it possible for individual employees to fix misspellings or add a simple element to a page on their own; in that sense it is a bit like Adobe Photoshop. To work with incoming traffic we use our own product, which I can't pin here. It helps us nurture visitors from their first session into signing up and further activation inside the product. In addition to @Carrrot we use Google Analytics for traffic-source awareness, and for monitoring customers inside the product FullStory helps us a lot with its rage clicking and abandoned links. Activation and retention are handled by our own product through pop-ups, live chat, and emails that are all based on customer behavior.


Decision about Chartio, Google Analytics

legosystem
Chartio
Google Analytics

Most companies use Google Analytics via the web interface. And although it packs a lot of power and features, it still lacks integration with other data sources; you can hook up Search Console and AdWords, but that is about it. However, there is a way to tap into Google Analytics and combine it with any other data source you have, from your backend or any other tool you might use: the Google Analytics API. Lots of people are familiar with its existence, but few are using it. The decision to move to the Google Analytics API was based on the shortcomings of the web interface and on the ability to collaborate using the same data set and the same view.
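A minimal sketch of pulling data this way with the v4 Reporting API and the Python client (the view ID, metrics, and key file are placeholders; the post doesn't say which API version or client library they use):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"  # placeholder service-account key
VIEW_ID = "123456789"              # placeholder GA view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"]
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

# One report request: sessions and users by source/medium over the last 30 days
response = analytics.reports().batchGet(
    body={
        "reportRequests": [
            {
                "viewId": VIEW_ID,
                "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
                "metrics": [
                    {"expression": "ga:sessions"},
                    {"expression": "ga:users"},
                ],
                "dimensions": [{"name": "ga:sourceMedium"}],
            }
        ]
    }
).execute()

# Flatten the rows into tuples that can be loaded into whatever store
# Chartio already queries, next to the other data sources
for row in response["reports"][0]["data"].get("rows", []):
    source = row["dimensions"][0]
    sessions, users = row["metrics"][0]["values"]
    print(source, sessions, users)
```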

So combining Chartio for a good shareable view with the Google Analytics API and all the other data we have (SEMrush, OnCrawl, and backend data from several sources) provides a quick view of the KPIs we care about and a common view that can be discussed easily, especially for remote teams.
