AWS has a lot of great services, that is true. But sometimes the integrations between them look more like the result of a hackathon project. One example is the access logs of AWS Amplify websites. In this series of posts, I will show you first how to quickly analyze the logs locally with common tools, and later how to set up a flow that exports logs to S3, runs SQL queries with Athena, and sends automated emails with daily reports.| www.outcoldman.com
Suppose you want to analyze web traffic properly. Much of the time, you want to know the geographic origin of the requests. It is possible to map IP addresses to locations using the MaxMind GeoIP tables. MaxMind provides a free Lite version of their tables that they update weekly. In this post, we create a Lambda function that runs on a weekly schedule. This function downloads the latest tables (CSV files) from MaxMind and uploads them to S3. We use these CSV files as sources for AWS Athena tables to ...
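The weekly refresh described above can be sketched as a small Lambda handler. This is a minimal, hypothetical sketch: the download URL parameters, environment variable names, bucket, and key are placeholders, and a real MaxMind download requires a valid license key.

```python
# Hypothetical sketch of a scheduled Lambda that fetches the latest
# MaxMind GeoLite2 CSV bundle and stores it in S3. All names here
# (env vars, bucket, key) are placeholders, not the post's actual code.
import os
import urllib.request

GEOIP_URL = (
    "https://download.maxmind.com/app/geoip_download"
    "?edition_id=GeoLite2-City-CSV&suffix=zip"
    "&license_key=" + os.environ.get("MAXMIND_LICENSE_KEY", "")
)
BUCKET = os.environ.get("GEOIP_BUCKET", "my-geoip-bucket")  # placeholder
KEY = "geoip/GeoLite2-City-CSV.zip"


def handler(event, context):
    # boto3 ships with the Lambda runtime; import lazily so the module
    # can be loaded (and unit-tested) without it installed locally.
    import boto3

    s3 = boto3.client("s3")
    with urllib.request.urlopen(GEOIP_URL) as resp:
        # Stream the zip archive straight into S3 without buffering
        # the whole file in memory.
        s3.upload_fileobj(resp, BUCKET, KEY)
    return {"uploaded": f"s3://{BUCKET}/{KEY}"}
```

An EventBridge (CloudWatch Events) rule with a `rate(7 days)` schedule would then invoke this handler weekly.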
I have been using Alfred for a long time. If you have not heard about Alfred, the short description is Spotlight on steroids. It can do a lot, but to use some of the really cool features you have to write your own Workflows or find ones somebody else has already built. I wrote some a long time ago, but to be honest, lately I noticed that I don't use it much, except to open applications and a few workflows. And Spotlight has improved significantly in the last few versions of macOS. So I hav...
We discussed how to download AWS Amplify access logs and analyze them locally in the first part. In this part, we will configure a constant flow of access logs to S3 storage and create AWS Athena tables to analyze the logs. To continuously upload access logs to S3, we will use a Lambda function invoked hourly. We will partition the data daily, which will allow us to store data for years and use AWS Athena efficiently (see Partitioning Data). Because we will keep data partitioned, we w...
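The daily partitioning mentioned above typically means writing objects under Hive-style `year=/month=/day=` prefixes, which Athena can use for partition pruning so queries scan only the days they need. A minimal sketch, assuming a made-up `access-logs` prefix and file name:

```python
# Sketch of the daily-partitioned S3 key layout assumed for Athena
# partition pruning. The prefix and file name are illustrative only.
from datetime import datetime, timezone


def partitioned_key(prefix: str, ts: datetime, filename: str) -> str:
    """Build a Hive-style partitioned S3 object key, e.g.
    access-logs/year=2020/month=01/day=15/log.csv"""
    return (
        f"{prefix}/year={ts.year:04d}/month={ts.month:02d}"
        f"/day={ts.day:02d}/{filename}"
    )


ts = datetime(2020, 1, 15, tzinfo=timezone.utc)
print(partitioned_key("access-logs", ts, "log.csv"))
# → access-logs/year=2020/month=01/day=15/log.csv
```

With this layout, an Athena table declared with `year`, `month`, and `day` partition columns only reads the objects under the matching prefixes.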