TalkingQuickly's Today I Learned

10 posts about #devops

Who has access to which schemas in Postgres

Great little snippet for understanding which users have which privileges on a schema in Postgres / Redshift.

SELECT *
FROM
    (
    SELECT
        schemaname
        ,objectname
        ,usename
        ,has_table_privilege(usrs.usename, fullobj, 'select') AND has_schema_privilege(usrs.usename, schemaname, 'usage') AS sel
        ,has_table_privilege(usrs.usename, fullobj, 'insert') AND has_schema_privilege(usrs.usename, schemaname, 'usage') AS ins
        ,has_table_privilege(usrs.usename, fullobj, 'update') AND has_schema_privilege(usrs.usename, schemaname, 'usage') AS upd
        ,has_table_privilege(usrs.usename, fullobj, 'delete') AND has_schema_privilege(usrs.usename, schemaname, 'usage') AS del
        ,has_table_privilege(usrs.usename, fullobj, 'references') AND has_schema_privilege(usrs.usename, schemaname, 'usage') AS ref
    FROM
        (
        -- All tables and views, with a schema-qualified name for has_table_privilege
        SELECT schemaname, 't' AS obj_type, tablename AS objectname, schemaname || '.' || tablename AS fullobj FROM pg_tables
        WHERE schemaname NOT IN ('pg_internal')
        UNION
        SELECT schemaname, 'v' AS obj_type, viewname AS objectname, schemaname || '.' || viewname AS fullobj FROM pg_views
        WHERE schemaname NOT IN ('pg_internal')
        ) AS objs
        -- Cross join against every user so each (object, user) pair is checked
        ,(SELECT * FROM pg_user) AS usrs
    ) AS privs
WHERE (sel = true OR ins = true OR upd = true OR del = true OR ref = true)
AND schemaname = 'SOME_SCHEMA_NAME'
ORDER BY objectname;
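
To spot-check a single user against a single object, the underlying functions can also be called directly; the user, schema and table names below are placeholders:

-- Does some_user have USAGE on the schema and SELECT on the table?
SELECT has_schema_privilege('some_user', 'public', 'usage') AS schema_usage,
       has_table_privilege('some_user', 'public.some_table', 'select') AS can_select;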

Note that if you can't track down why a user has a particular permission, it's worth delving into the permissions granted to the public role; more about that here: https://blog.dbrhino.com/locking-down-permissions-in-postgresql-and-redshift.html

Calculating distance in Redshift

Execute the below in Redshift as an admin to create a function which calculates the distance in miles between two lat/lng pairs using the haversine formula; this saves having lots of queries full of trig. Multiply the result by 1.609344 to convert to kilometres.

------- DISTANCE_IN_MILES FUNCTION ---------
CREATE FUNCTION DISTANCE_IN_MILES (orig_lat float, orig_long float, dest_lat float, dest_long float)
  RETURNS float
STABLE
AS $$
  # Haversine formula: great-circle distance between two points on a sphere
  import math
  r = 3963.1676  # Earth's mean radius in miles
  phi_orig = math.radians(orig_lat)
  phi_dest = math.radians(dest_lat)
  delta_lat = math.radians(dest_lat - orig_lat)
  delta_long = math.radians(dest_long - orig_long)
  a = math.sin(delta_lat/2) * math.sin(delta_lat/2) + math.cos(phi_orig) \
      * math.cos(phi_dest) * math.sin(delta_long/2) * math.sin(delta_long/2)
  c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
  return r * c
$$ LANGUAGE plpythonu;

From: https://stackoverflow.com/questions/25316193/postgres-sql-amazon-redshift-how-to-calculate-distance-between-two-latitude
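
Once created it behaves like any scalar function; the coordinates below (roughly London and Paris) are just for illustration:

-- Approximate London to Paris distance, in miles and km
SELECT DISTANCE_IN_MILES(51.5074, -0.1278, 48.8566, 2.3522) AS miles,
       DISTANCE_IN_MILES(51.5074, -0.1278, 48.8566, 2.3522) * 1.609344 AS km;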

Cloudflare Free SSL on Heroku

It's been bugging me for ages that I could only access this TIL instance via the Heroku URL, not the til.talkingquickly.co.uk subdomain. It turns out that for SSL to be terminated by Cloudflare, the CNAME entry in Cloudflare should point at the Heroku-provided app URL, e.g. YOURAPP.herokuapp.com, rather than the custom subdomain CNAME from Heroku (e.g. yourdomain.herokudns.com).
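
As a sketch, the resulting Cloudflare DNS entry looks something like this (the subdomain and app name are illustrative):

Type:    CNAME
Name:    til
Content: YOURAPP.herokuapp.com
Proxy:   Proxied (so Cloudflare terminates SSL)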

From: https://thoughtbot.com/blog/set-up-cloudflare-free-ssl-on-heroku

Creating a read only user for Postgres

One of my most common DBA tasks is adding read-only users for hooking up analytics systems, and this is the most concise approach I've found. Since Redshift mirrors Postgres for most of these things, the same approach should also work on Redshift.

-- Create a group
CREATE ROLE readaccess;

-- Grant access to existing tables
GRANT USAGE ON SCHEMA public TO readaccess;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO readaccess;

-- Grant access to future tables
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO readaccess;

-- Create a final user with password
CREATE USER some_username WITH PASSWORD 'secret';
GRANT readaccess TO some_username;

This came from this gist: https://gist.github.com/oinopion/4a207726edba8b99fd0be31cb28124d0
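
One caveat: ALTER DEFAULT PRIVILEGES only applies to objects created by the role that runs it (unless FOR ROLE is specified), so tables created by other users may still need an explicit GRANT. A quick way to verify, assuming a local server and placeholder database/table names:

# Connect as the read-only user and confirm a read works
psql -h localhost -U some_username -d some_database -c 'SELECT count(*) FROM some_table;'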

Mismatched kubeadm versions & kubeadm join

When joining a cluster with versions of kubeadm and/or kubelet which don't match the master, you get an error like:

[kubelet] Downloading configuration for the kubelet from the "kubelet-config-1.11" ConfigMap in the kube-system namespace
configmaps "kubelet-config-1.11" is forbidden: User "system:bootstrap:tn8mhk" cannot get configmaps in the namespace "kube-system"

This can be fixed by installing matching versions (e.g. apt-cache madison kubeadm to list the available versions, then apt-get install kubeadm=OUTPUT_FROM_APT-CACHE).

Use apt-cache policy PACKAGE_NAME to get the currently installed version on the master.
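
Putting it together on the joining node (the version string below is illustrative; use whatever apt-cache policy reports on the master):

# List the versions available in the apt repo
apt-cache madison kubeadm

# Pin kubeadm and kubelet to the master's version (1.11.3-00 is illustrative)
apt-get install -y kubeadm=1.11.3-00 kubelet=1.11.3-00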

Getting Minikube running on Windows 10

Not quite as simple as it looks; you need to:

  • Enable Hyper-V in Windows 10
  • Create a new Virtual Switch called Primary Virtual Switch (see the PowerShell sketch below)
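
Creating the switch can be done through Hyper-V Manager or from an elevated PowerShell prompt. A minimal sketch, assuming the physical adapter is called "Ethernet" (check yours with Get-NetAdapter):

# Create an external virtual switch bound to the physical adapter
# ("Ethernet" is an assumed adapter name)
New-VMSwitch -Name "Primary Virtual Switch" -NetAdapterName "Ethernet" -AllowManagementOS $true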

Then init Minikube with:

minikube start --vm-driver hyperv --hyperv-virtual-switch "Primary Virtual Switch"

Thanks to this write-up for getting me on track.

Running docker-compose container with TTY

If you want to run a service from a docker-compose file, exposing its ports but also attaching a TTY, for example running rails server while using pry to debug, use the following:

docker-compose run --service-ports app
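
This works because docker-compose run allocates a TTY by default but does not publish the ports defined for the service; --service-ports restores them. A minimal sketch of a matching docker-compose.yml, where the service name app and the Rails port are assumptions:

version: "3"
services:
  app:
    build: .
    command: bundle exec rails server -b 0.0.0.0
    ports:
      - "3000:3000" # only published by docker-compose run with --service-ports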

Finding Large Directories on Ubuntu

A very quick and dirty way to find large directories, e.g. when you're out of disk space on an Ubuntu server, is this:

du -h --max-depth=1 /folder1/folder2

Then drill down: e.g. if folder3 shows up as the largest directory in the output of the above command, run

du -h --max-depth=1 /folder1/folder2/folder3

You can also sort the output if you remove -h and then pipe to sort -n, e.g.

du --max-depth=1 /folder1/folder2 | sort -n

But I haven't yet got that to work nicely with humanised file sizes (e.g. showing M/G instead of bytes).
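
On reasonably recent GNU coreutils, sort understands human-readable sizes via -h, which sidesteps this:

du -h --max-depth=1 /folder1/folder2 | sort -h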

Easy SSH Tunnel to any service

To connect to a service on a remote machine which is behind a firewall, as long as there's SSH access, this can be done with:

ssh -L LOCALPORT:localhost:REMOTEPORT -p 22 USER@REMOTE_HOST

Then simply connect to the service on localhost:LOCALPORT; the 22 above is the port SSH is running on the remote machine.
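
For example, to reach a Postgres server listening on the remote machine's port 5432 (the host and usernames are placeholders):

# Forward local port 5433 to Postgres on the remote machine
ssh -L 5433:localhost:5432 -p 22 deploy@db.example.com

# Then, in another terminal, connect through the tunnel
psql -h localhost -p 5433 -U some_user some_database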

The AWS CLI is Awesome

Install (OSX) with brew install awscli and configure with aws configure. Enter credentials and a default region (this should be the region identifier, e.g. eu-west-1, not the display name, e.g. Ireland).

To get the size of a bucket:

aws s3api list-objects --bucket BUCKET_NAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
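
A human-readable alternative, assuming a reasonably recent awscli, is the --summarize flag on aws s3 ls:

aws s3 ls s3://BUCKET_NAME --recursive --summarize --human-readable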

To download a local dump of a bucket:

aws s3 sync s3://BUCKET_NAME_OR_PATH .