Here is the aws_s3.table_import_from_s3 stub that pairs nicely with the previously blogged aws_s3.query_export_to_s3 stub.

CREATE SCHEMA IF NOT EXISTS aws_s3;

CREATE OR REPLACE FUNCTION aws_s3.table_import_from_s3(
  table_name TEXT, column_list TEXT, opt TEXT,
  bucket TEXT, path TEXT, region TEXT,
  OUT rows INTEGER)
AS $$
DECLARE
  copy_stmt TEXT;
BEGIN
  -- Build a COPY statement that reads from a local file instead of S3.
  copy_stmt := 'COPY ' || table_name;
  IF column_list IS NOT NULL AND length(column_list) > 0 THEN
    copy_stmt := copy_stmt || ' (' || column_list || ')';
  END IF;
  copy_stmt := copy_stmt || ' FROM ''' || path || ''' WIT...
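For reference, a call to the stub might look like the sketch below. The table, columns, and file path are placeholders, and it assumes the rest of the stub appends the opt string to the COPY statement and executes it against a file readable by the local PostgreSQL server, with bucket and region effectively ignored by the local stand-in.

-- Hypothetical usage of the local stub; table, columns, and path are made up.
SELECT aws_s3.table_import_from_s3(
  'staging.customers',           -- table_name
  'id, name, email',             -- column_list
  '(FORMAT csv, HEADER true)',   -- opt: COPY options, assuming the stub appends them as-is
  'unused-bucket',               -- bucket: presumably ignored by a local stub
  '/tmp/customers.csv',          -- path: local file readable by the PostgreSQL server
  'us-east-1'                    -- region: presumably ignored by a local stub
);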
AWS RDS and Aurora database services have a very useful enhancement to the native PostgreSQL COPY command, aws_s3.query_export_to_s3, that lets you export from RDS/Aurora directly to AWS S3 storage. This can be very useful for moving data around, including exporting from Production, with masking functions applied, to land in test/debugging environments. There are currently some limitations around cross-account exporting, the main one being that you can't. But there are simple workarounds to post-process the files to either reke...
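For reference, a minimal sketch of an export call on RDS/Aurora (the query, bucket, key, and region are placeholders):

-- Export the result of a query straight to S3 as CSV.
SELECT * FROM aws_s3.query_export_to_s3(
  'SELECT * FROM orders WHERE created_at >= now() - interval ''30 days''',
  aws_commons.create_s3_uri('my-export-bucket', 'exports/orders.csv', 'us-east-1'),
  'format csv'   -- options string passed through to COPY
);
-- Reports rows_uploaded, files_uploaded, and bytes_uploaded.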
Protecting your data is important, both from people who want to get at it and, more importantly, from your own staff, who are only human and thus subject to making mistakes. Using Production data can be very useful for all kinds of debugging and testing, but you don’t want to expose confidential production data to staff who do not need to see it. Masking is useful as you export data from Production and load it into your test/debug environment.
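As a sketch of that idea, the masking can live inside the export query itself, so the files that land in S3 never contain the real values. The table, columns, and bucket below are hypothetical, and md5()/right() stand in for whatever masking functions you actually use:

-- Hypothetical masked export: sensitive columns are hashed or truncated
-- in the SELECT, so only masked values ever leave Production.
SELECT * FROM aws_s3.query_export_to_s3(
  $$SELECT id,
           md5(email)                  AS email,  -- hash instead of the real address
           'xxx-xx-' || right(ssn, 4)  AS ssn,    -- keep only the last four digits
           created_at
      FROM customers$$,
  aws_commons.create_s3_uri('my-test-data-bucket', 'masked/customers.csv', 'us-east-1'),
  'format csv'
);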