
AWS containerised apps and database on same Redshift cluster

I have a simple question for someone with AWS experience, but I am getting a little confused with the terminology and don't know how to decide which node type to purchase.

At my company we currently have a Postgres db that we insert into continuously. We insert roughly 600M rows per year at the moment but would like to be able to scale up. Each row is basically a timestamp, two floats, one int, and one enum type.

So the workload is write-intensive, but also with constant small reads (and the occasional large read).
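For a write-heavy Postgres workload like this, batching many rows into a single multi-row INSERT is the usual way to sustain a high insert rate. A minimal std-only Rust sketch of the row shape described above and a batch statement builder (the table and column names `readings`, `ts`, `value_a`, etc. are made up for illustration):

```rust
// Hypothetical row shape matching the description:
// a timestamp, two floats, one int, and one enum.
#[derive(Debug, Clone, Copy)]
enum Status {
    Ok,
    Fault,
}

struct Reading {
    ts_epoch_secs: i64,
    value_a: f64,
    value_b: f64,
    device_id: i32,
    status: Status,
}

// Build one multi-row INSERT instead of one statement per row.
// NOTE: string interpolation is only safe here because all fields are
// numeric or an enum we control; real code should use prepared
// statements (or COPY) via a driver crate instead.
fn batched_insert_sql(rows: &[Reading]) -> String {
    let values: Vec<String> = rows
        .iter()
        .map(|r| {
            format!(
                "(to_timestamp({}), {}, {}, {}, '{:?}')",
                r.ts_epoch_secs, r.value_a, r.value_b, r.device_id, r.status
            )
        })
        .collect();
    format!(
        "INSERT INTO readings (ts, value_a, value_b, device_id, status) VALUES {}",
        values.join(", ")
    )
}

fn main() {
    let rows = vec![
        Reading { ts_epoch_secs: 1_700_000_000, value_a: 1.5, value_b: 2.5, device_id: 7, status: Status::Ok },
        Reading { ts_epoch_secs: 1_700_000_010, value_a: 1.6, value_b: 2.4, device_id: 7, status: Status::Fault },
    ];
    println!("{}", batched_insert_sql(&rows));
}
```

In production you would send this through a driver such as tokio-postgres with prepared statements, or use Postgres COPY for bulk loads; the sketch only illustrates the batching idea.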

There are also two services that need to be run (both Rust based):

1. We have a Rust application that abstracts the db data, allowing clients to access it through a RESTful interface.

2. We have a Rust app that collects the data to import from thousands of individual devices over Modbus. These devices are on a private mobile network. Can I set up AWS cluster nodes to access a private network through a VPN?

We would like to move to Amazon Redshift, but I am confused by the node types.

Amazon recommends choosing RA3 or DC2.

If we chose ra3.4xlarge, that means we get one cluster of nodes, right?

Can I run our Rust services on that cluster along with a number of Redshift database instances? I believe AWS uses Docker, and I think I could containerise my services easily.

Or am I misunderstanding things, and when you purchase a Redshift cluster you can only run Redshift on it, and have to get a different cluster for containerised applications, possibly an EC2 cluster?

Can anyone recommend a better fit for scaling this workload?

Thanks

I would not recommend Redshift for this application, and I'm a Redshift guy. Redshift is designed for analytic workloads (lots of reads and few, large writes). Constant small writes are not what it is designed for.

I would point you to Postgres on RDS as the best fit. It already has a RESTful API interface. This will be more of the transactional database you are looking for, with little migration work.

When your data gets really large (TB+), you can add Redshift to the mix to quickly perform the analytics you need.

Just my $.02

Redshift is a managed service: you don't get any access to it for installing things, nor is there any possibility of installing/running custom software of your own.

Or am I misunderstanding things and when you purchase a Redshift cluster you can only run Redshift on this cluster

Yes, you don't run your own software on it - AWS manages the cluster and you just run your analytics/queries etc.

have to get a different one for containerised applications, possibly an ec2 cluster?

Yes, you could make use of EC2, running the orchestration yourself, or use ECS/Fargate/EKS, depending on your budget, how skilled your team members are, etc.
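For ECS or Fargate, the containerisation step mentioned in the question is usually a short multi-stage Dockerfile. A minimal sketch, assuming the Rust binary is named `api-service` (a placeholder):

```dockerfile
# Build stage: compile the Rust service in release mode
FROM rust:1.75 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release

# Runtime stage: small image containing just the compiled binary
FROM debian:bookworm-slim
COPY --from=builder /app/target/release/api-service /usr/local/bin/api-service
EXPOSE 8080
ENTRYPOINT ["/usr/local/bin/api-service"]
```

The resulting image can be pushed to ECR and referenced from an ECS task definition or a Kubernetes manifest on EKS.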
