# Survey of Data Synchronization Tools
## Requirements
* Data synchronization
* AAA
* Messaging / Notifications
## Necessary Steps for Comparison of Tools and Frameworks
* Requirements analysis
* A proper interface definition

An easy way to get started is to search for existing surveys!
* Tutorials / Intros / Documentation
* Who is using it?
* **Community / Help** How many hits do you find on this topic? Look at search trends (e.g. Google Trends) and Stack Overflow question counts. Is there an active community to ask for help?
* Changelogs / Development
* Learning to use the tool
* Init and Prerequisites
* Documentation and research
* Coding
* Testing
* Deployment
First, search for tools or frameworks comparable to Firebase [firestore]. Form teams of at least five participants, work through the steps above, and describe the chosen tool.
## Couchbase
_"Couchbase is the modern database for enterprise applications.
Couchbase is a distributed document database with a powerful search engine and in-built operational and analytical capabilities. It brings the power of NoSQL to the edge and provides fast, efficient bidirectional synchronization of data between the edge and the cloud.
Find the documentation, samples, and references to help you use Couchbase and build applications."_ [couchbase-doc]
**Team:** Mira Haselberger, _Michaela Heinzel_, Maria Ottendorfer, Julian Schwarz, Carmen Dimov, Martin Jindra
__An overview follows:__
Where does it come from? How is it structured? What are the main parts (especially the sync implementation)? Have you found the white paper in the docs?
https://resources.couchbase.com/c/server-arc-overview?x=V3nd_e#zoom=100
### Tutorials / Intros / Documentation
_"The Couchbase Autonomous Operator provides native integration of Couchbase Server with open source Kubernetes and Red Hat OpenShift. It enables you to automate the management of common Couchbase tasks such as the configuration, creation, scaling, and recovery of Couchbase clusters. By reducing the complexity of running a Couchbase cluster, it lets you focus on the desired configuration and not worry about the details of manual deployment and life-cycle management."_ [couchbase-intro]
The introduction also lists four getting-started tutorials:
* [Installing the Operator on Kubernetes](https://docs.couchbase.com/operator/current/install-kubernetes.html)
* [Installing the Operator on Red Hat OpenShift](https://docs.couchbase.com/operator/current/install-openshift.html)
* [Deploying a Couchbase cluster](https://docs.couchbase.com/operator/current/howto-couchbase-create.html)
* [Connecting to the Couchbase web console](https://docs.couchbase.com/operator/current/howto-ui.html)
It also describes how the documentation is organized into:
* Getting Started
* Learn
* Manage
* Reference
* Tutorials
#### Tutorials
A larger collection of tutorials can be found under [couchbase-tutorials]; it consists of a variety of hands-on tutorials that can be filtered by role, language, and level (Beginner, Intermediate, Advanced).
To get started with Couchbase Server CE, there is a specific tutorial in which the "*project explains how to get started with Couchbase Community Edition, and briefly mentions all steps to do it, as well as the reasons or elements behind those.*" - [couchbase-gs-ce]
#### Documentation
The full documentation is available at [docs.couchbase.com](https://docs.couchbase.com/home/index.html).
### Who is using it?
- Doddle: Doddle uses Couchbase Lite and Sync Gateway to provide retailers with easy solutions for click and collect, ship from store, and returns.
- Domino's Pizza: Creating personalized marketing campaigns with unified real-time data.
- LinkedIn: LinkedIn chose Couchbase for its ease of use and extremely low latency, and now uses Couchbase for over 50 use cases companywide.
- PayPal: PayPal manages over 1B documents and 10TB of data with Couchbase while processing millions of user analytics updates per minute.
- Pfizer: Pfizer deployed Couchbase on the AWS cloud in order to guarantee high performance and flexibility for dozens of healthcare applications.
- Ryanair: Ryanair uses Couchbase Mobile’s embedded database and integrated sync to cut travel booking time from 5 minutes to 30 seconds.
- Tesco: Tesco, the world’s third-largest retailer, uses Couchbase to easily scale its catalog and inventory management for millions of customers.
### Community / Help
Stackoverflow: 3,748 questions for Couchbase (as tag)
Couchbase offers community support through the Couchbase Forums and free training. The forum shows very recent activity from many users, and most questions receive at least one answer. [couchbase-forum]
### Changelogs / Development
Changelogs/release notes for Couchbase Server and Couchbase Lite can be accessed in the resources [Couchbase-server-changelog] and [Couchbase-lite-changelog] respectively.
Despite an extensive search, no public roadmap for Couchbase could be found. Versions are announced shortly before they are released, but a real roadmap of what is planned is nowhere to be found.
### Init and Prerequisites
Because of the many different versions of Linux distributions, Windows, and macOS, setting up Couchbase in Docker is one of the most consistent approaches. [docker-installation]
1. To run an instance of Couchbase in Docker, run the following command:
```
docker run -d --name couchbase -p 8091-8094:8091-8094 -p 11210:11210 -v couchbase_data:/opt/couchbase/var couchbase
```
2. Open the setup guide in the browser at `http://localhost:8091/`.
3. Either set up a new cluster or join an existing one. Create a user with a password, accept the terms, and choose `Finish with defaults`.
To get access to the query shell, connect to the container:
```
docker exec -it couchbase cbq -u Administrator
```
You will then be prompted for the password.
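Besides the interactive `cbq` shell, queries can also be sent to the query service's REST endpoint on port 8093, which the `docker run` command above publishes. A minimal Python sketch, assuming the default `Administrator` account; the password is a placeholder:

```python
import base64
import json
from urllib import request

# Default N1QL query-service endpoint (port 8093 was published above).
QUERY_URL = "http://localhost:8093/query/service"

def build_n1ql_request(statement, user="Administrator", password="password"):
    """Build an HTTP request for Couchbase's N1QL REST endpoint.
    No network I/O happens here; sending it requires a running cluster."""
    body = json.dumps({"statement": statement}).encode("utf-8")
    req = request.Request(QUERY_URL, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

# To actually run it against the container started above:
# with request.urlopen(build_n1ql_request("SELECT 1;")) as resp:
#     print(json.load(resp))
```

The request is only assembled here, so the sketch stays runnable without a cluster; uncomment the last lines against a live container.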
### Documentation and Research
### Coding
### Testing
### Deployment
## CouchDB
_"Apache CouchDB is one of a new breed of database management systems. This topic explains why there’s a need for new systems as well as the motivations behind building CouchDB.
As CouchDB developers, we’re naturally very excited to be using CouchDB. In this topic we’ll share with you the reasons for our enthusiasm. We’ll show you how CouchDB’s schema-free document model is a better fit for common applications, how the built-in query engine is a powerful way to use and process your data, and how CouchDB’s design lends itself to modularization and scalability."_ [couchdb-doc]
Architecture of CouchDB:
The client generally uses the **HTTP API** to interact with CouchDB. HTTP requests go to the **CouchDB engine**, which represents the core of the system. It is responsible for storing internal data, documents, and views. It does so using a B-tree, in which data is accessed by a key or a key range. The engine communicates with multiple **replica databases**, which store the actual data in **documents**. More precisely, the data is stored as JSON, and JavaScript (MapReduce) is used for querying it. [couchdb-architecture]
Features of CouchDB: [couchdb-architecture]
- "***Replication:** It provides the simplest form of replication and no other database is so simple to replicate.*"
- "***Document Storage:** It is a NoSQL database that follows document storage where each field is uniquely named and contains values of various data types such as text, number, Boolean, lists, etc.*"
- "***ACID Properties:** The CouchDB file layout follows all the features of ACID properties.*"
- "***Security:** It also provides database-level security and the permissions are divided into readers and admins where readers can do both the read and write to the database.*"
- "***Map/Reduce:** The main reason for the popularity of CouchDB is a map/reduce system.*"
- "***Authentication:** CouchDB facilitates you to keep authentication open via a session cookie-like a web application.*"
- *"**Built for Offline:** CouchDB can replicate to devices like smartphones that have a feature to go offline and handle data sync for you when the device is back online."*
- *"**Eventual Consistency:** CouchDB guarantees eventual consistency to provide both availability and partition tolerance."*
- *"**HTTP API:** All items have a unique URI(Unique Resource Identifier) that gets exposed via HTTP. It uses the HTTP methods like POST, GET, PUT, and DELETE for the four basic CRUD (Create, Read, Update, Delete) operations on all resources."*
Advantages of CouchDB: [couchdb-architecture]
1. "*HTTP API is used for easy Communication.*"
2. "*It is used to store any type of data.*"
3. "*ReduceMap allows optimizing the combining of data.*"
4. "*Structure of CouchDB is very simple*"
5. "*Fast indexing and retrieval.*"
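The HTTP API feature above maps the four CRUD operations onto HTTP verbs. A minimal sketch of that mapping (the database and document names are hypothetical; update and delete additionally need the document's current revision):

```python
def couch_crud(op, db, doc_id, rev=None):
    """Map a CRUD operation to the (method, path) pair CouchDB's HTTP API expects."""
    path = f"/{db}/{doc_id}"
    if op in ("create", "update"):
        # Updates additionally require the current revision as a `rev` query parameter.
        return ("PUT", path + (f"?rev={rev}" if rev else ""))
    if op == "read":
        return ("GET", path)
    if op == "delete":
        return ("DELETE", path + f"?rev={rev}")
    raise ValueError(f"unknown operation: {op}")

print(couch_crud("read", "mydb", "doc-1"))  # ('GET', '/mydb/doc-1')
```

The `rev` requirement is what enforces CouchDB's optimistic concurrency control: a stale revision makes the update fail instead of silently overwriting.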
**Team:** _Geyer_, Katadzic, Patlidzanovic, Malburg, Grubmüller
### Tutorials / Intros / Documentation
#### Tutorial
There is a tutorial titled 'Getting Started' on the official website of CouchDB [couch-db]. It focuses on CouchDB's features and starts with creating a database. Subsequently, it also guides you through experimenting with CouchDB views. [couch-db-tutorial]
#### Intro
A brief introduction to CouchDB is given in [couch-db-intro].
#### Documentation
The complete HTTP API is divided into the following areas:
- Server API [couch-db-server-api]
- Database API [couch-db-database-api]
- Document API [couch-db-document-api]
- Replication API [couch-db-replication-api]
### Who is using it?
- Amadeus IT Group, for some of their back-end systems
- Credit Suisse, for internal use at commodities department for their marketplace framework
- Meebo, for their social platform (Web and applications)
- npm, for their package registry
- Sophos, for some of their back-end systems
- The BBC, for its dynamic content platforms
- CANAL+ for international on-demand platform at CANAL+ Overseas.
- Protogrid, as storage back-end for their rapid application development framework
[couchdb-wikipedia]
### Community / Help
#### GitHub
There is a GitHub repository for CouchDB with around 167 contributors. Furthermore, around 1,000 issues have been closed, but current activity does not seem very high: around 330 issues are still open. (Last checked: 17.01.2022) [couchdb-github]
#### Stackoverflow
Fortunately, there is an active community for CouchDB support on Stack Overflow. There are around 6,000 questions, most of them already answered. (Last checked: 17.01.2022) [couchdb-stackoverflow]
Overall, the CouchDB community seems to be very large and fairly active.
### Changelogs / Development
**Latest feature release: 3.2.0 (published on 2021-10-12)** [couch-db-blog]
No major changes.
**Next up: the 4.x branch**, which is currently in development. This release plans some big changes, but none that will completely change the application.
### Init and Prerequisites
***Time to set up CouchDB with Docker: 10 min 42 sec***
The following links are used to install couchdb, setup a cluster and change the server configuration if needed:
* Installing CouchDB, for example on Unix systems or with Docker [couchdb-install]
* Setting up a cluster, either a single-node or a multi-node configuration [couchdb-setup]
* Configuring the server [couchdb-configuration]
Once the necessary steps are completed, the server should be reachable at `http://127.0.0.1:5984` or `http://<Server-IP>:5984`, provided the default port was not changed. This port handles all HTTP API requests. Because specific endpoints are protected by authentication, a user has to be created and configured if not already done; for this, use the following tutorial: [couchdb-authentication].
To check whether the server was configured correctly, use the following curl command: `curl http://127.0.0.1:5984/`. Change the IP to match your server's address. If done correctly, you should see the following response (some field values may differ):
```
{
"couchdb": "Welcome",
"version": "3.0.0",
"git_sha": "83bdcf693",
"uuid": "56f16e7c93ff4a2dc20eb6acc7000b71",
"features": [
"access-ready",
"partitioned",
"pluggable-storage-engines",
"reshard",
"scheduler"
],
"vendor": {
"name": "The Apache Software Foundation"
}
}
```
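The welcome response can also be checked programmatically. A small sketch that parses the JSON shown above (trimmed to the relevant fields) and verifies the server is up and recent enough:

```python
import json

# The JSON body returned by `GET /`, with example values from the response above.
welcome = json.loads("""
{
  "couchdb": "Welcome",
  "version": "3.0.0",
  "features": ["access-ready", "partitioned", "pluggable-storage-engines", "reshard", "scheduler"]
}
""")

def is_couchdb_up(body, min_major=3):
    """Treat the server as healthy if it greets us and is at least `min_major`.x."""
    major = int(body.get("version", "0").split(".")[0])
    return body.get("couchdb") == "Welcome" and major >= min_major

print(is_couchdb_up(welcome))  # True
```

In a real deployment you would feed this function the body of `curl http://127.0.0.1:5984/` instead of the hard-coded sample.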
### Documentation and Research
There is a thorough documentation for couchdb on their official website. [couchdb-docs]
All aspects necessary to use CouchDB are included:
- Installation
- Set-Up
- Configuration
- Cluster Management
- Maintenance
- Fauxton
- Experimental Features
Furthermore, there are references to look at for:
- API
- JSON Structure
- Query Server
- Partitioned Databases
There are even quick references for:
- API
- Configuration
### Coding
In order to use CouchDB in your code, you have to use the REST interface.
You can limit access from arbitrary users with authentication, which is described in [couchdb-authenication].
Generally, you can get started with this article: [couchdb-gettingStarted].
If you want detailed documentation about the API, see this page: [couchdb-api].
### Testing
Because you communicate with CouchDB through the REST interface, you can create unit tests against it.
A best practice is to create a new database for each test.
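Following that practice, a test fixture can generate a fresh database name per test so runs never collide. A sketch (the actual PUT/DELETE calls against the REST interface are omitted; only the naming and fixture structure are shown):

```python
import unittest
import uuid

def fresh_db_name(prefix="test"):
    """Return a unique, CouchDB-legal database name (names must start
    with a lowercase letter) so concurrent test runs never collide."""
    return f"{prefix}-{uuid.uuid4().hex}"

class CouchDbTestCase(unittest.TestCase):
    """Sketch of a per-test fixture around CouchDB's REST interface."""

    def setUp(self):
        # A real fixture would PUT /<db> here to create the database ...
        self.db = fresh_db_name()

    def tearDown(self):
        # ... and DELETE /<db> here, keeping every test fully isolated.
        pass

    def test_names_are_unique(self):
        self.assertNotEqual(self.db, fresh_db_name())
```

Creating and dropping a throwaway database per test keeps fixtures independent and makes failures reproducible.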
### Deployment
#### Windows
The installation guide for windows is as follows: [couch-db-install-windows]
1. Get the latest Windows binaries from the CouchDB web site. Old releases are available at archive.
2. Follow the installation wizard steps. Be sure to install CouchDB to a path with no spaces, such as C:\CouchDB.
3. Your installation is not complete. Be sure to complete the Setup steps for a single node or clustered installation.
4. Open up Fauxton
5. It’s time to Relax!
#### Docker
CouchDB can be deployed with Docker by using the image `apache/couchdb`. Additionally, a version number can be specified, e.g. `latest` for the latest release, `3` for the latest 3.x version, or `2` for the latest 2.x version.
#### Kubernetes
CouchDB can be deployed on Kubernetes:
```
helm repo add couchdb https://apache.github.io/couchdb-helm
helm repo update
helm install --name my-release couchdb/couchdb
```
## AWS Datasync
*AWS DataSync is a secure, online service that automates and accelerates moving data between on premises and AWS storage services. DataSync can copy data between Network File System (NFS) shares, Server Message Block (SMB) shares, Hadoop Distributed File Systems (HDFS), self-managed object storage, AWS Snowcone, Amazon Simple Storage Service (Amazon S3) buckets, Amazon Elastic File System (Amazon EFS) file systems, Amazon FSx for Windows File Server file systems, and Amazon FSx for Lustre file systems.* - [aws-doc]
**Team:** _Kopcinski_, Fraberger, Timmelmayer, Matykiewicz, Przidal
### Overview
*"AWS DataSync is an online data transfer service that simplifies, automates, and
accelerates moving data between on-premises storage systems and AWS Storage
services, as well as between AWS Storage services. This whitepaper focuses on using
DataSync in conjunction with Amazon EFS.
DataSync accelerates and automates data transfer—removing the need to modify
applications, develop scripts, or manage infrastructure. DataSync uses a purpose-built
protocol and a parallel, multi-threaded architecture to accelerate and secure data
transfer at speeds up to 10 times faster than open-source tools.
DataSync has simple, per gigabyte pricing, so that you pay only for what you use.
DataSync uses a software agent to handle transferring files and supports both one-time
migrations and periodic synchronizations of data between on-premises file systems and
Amazon EFS.
DataSync provides end-to-end security, including encryption and integrity validation, to
ensure your data arrives securely, intact, and ready to use. It also supports VPC
endpoints, providing you with the option to transfer data without traversing the public
internet, further increasing the security of data copied online."* [aws-overview]
### Tutorials / Intros / Documentation
To set up a simple AWS DataSync instance, you can follow the official AWS tutorial: [aws-tutorial].
Furthermore, there are multiple third-party tutorials online, e.g. [aws-tut].
### Who is using it?
#### Autodesk
*Autodesk makes software for people who make things. If you’ve ever driven a high-performance car, admired a towering skyscraper, used a smartphone, or watched a great film, chances are you’ve experienced what millions of Autodesk customers are doing with their software. Autodesk successfully migrated over 700 terabytes of data from their on-premises Dell EMC Data Domain storage system to Amazon S3, and did it swiftly and effortlessly using AWS DataSync.
"Our petabyte scale data migration journey from on-premises to AWS was accomplished swiftly with minimal effort and was completely self-managed with AWS DataSync. This solution is a game changer!"* [aws-customers]
#### Deluxe Entertainment
*Deluxe, a world-leading video creation and distribution company, is working with AWS to digitally transform their movie business. Deluxe wanted to speed up their media distribution process and improve data security while also reducing costs and eliminating errors. Their Digital Cinema Network uses AWS Snowcone to connect the space-constrained movie theater environment to content stored in Amazon S3, and uses AWS DataSync to accelerate online distribution of movies and advertising to theaters in 38 key media markets worldwide. This solution allows Deluxe to reduce reliance on error-prone, costly, and manual media distribution, while improving data security. It also enables their studio and theater customers to respond faster to local market preferences so they can increase their revenue streams.* [aws-customers]
#### FORMULA 1
*FORMULA 1 racing began in 1950 and is the world’s most prestigious motor racing competition, as well as the world’s most popular annual sporting series.
“We at FORMULA 1 used both AWS DataSync and AWS Storage Gateway services to provide disaster recovery for various data streams. [...] We chose AWS DataSync as it allows us to sync our current data, totaling 400 TB, to Amazon S3 in a way we could control and it enables us to keep any new data in sync. With AWS DataSync we were able to provide DR by syncing data from our on-premises file server to Amazon S3 at an average of 4 TB per day, and did so with encryption during transfer and at rest. The settings and controls provided by AWS DataSync allow us to control when and how much data we want to sync without harming any other workloads."* [aws-customers]
#### Chan Zuckerberg Biohub
*Chan Zuckerberg Biohub actively nurtures and creates opportunities for leaders in science and technology to come together and drive discovery, setting the standard for collaborative science. Chan Zuckerberg Biohub conducts research that helps solve big health problems. By sparking collaborative, interdisciplinary work, these leaders empower the pursuit of intuition and the opportunity to explore the next questions — and answers.
"We have genome sequencers running daily and need to transfer files a minimum of two times each day from on premises to Amazon S3 in an accelerated and efficient manner, in order for us to analyze and process the sequencer data in AWS. AWS DataSync serves this purpose perfectly as we are able to easily schedule and automate the file transfer process, avoiding the need for costly commercial transfer tools. [...]"* [aws-customers]
#### Takara Bio Inc.
*Takara Bio uses AWS DataSync to accelerate data transfers to their genomics data analysis service on AWS. Takara Bio is a biotechnology business subsidiary of TAKARA SHUZO CO.,LTD. (currently TAKARA HOLDINGS INC.). Based on the corporate philosophy of “contributing to human health through the development of innovative biotechnologies such as genetherapy."
"The combination of fast and secure data transfer by AWS DataSync and the large storage capacity of Amazon S3 allowed us to smoothly deliver genome data, which is becoming increasingly large. We will continue to build an analysis platform using AWS for personalized medicine using whole genome sequencing, which is expected to increase in the future."* [aws-customers]
### Community / Help
If you have quick questions, AWS has an active community on the [official forum](https://repost.aws/), where questions usually get answered within a few hours. AWS also has quite a big Stack Overflow community with over 500 answered questions. Additionally, there is an official AWS GitHub account with over 300 publicly available repositories.
### Changelogs / Development
Latest API Version: datasync-2018-11-09
Latest documentation update: July 28, 2021
| **Change** | **Description** | **Date** |
|---|---|---|
| Support for Amazon FSx for Lustre file systems | AWS DataSync can now transfer files and folders to and from FSx for Lustre file systems. For more information about FSx for Lustre locations, see Creating a location for FSx for Lustre. | December 10, 2021 |
| Support for Hadoop Distributed File Systems (HDFS) | AWS DataSync now supports transferring files and folders to and from HDFS clusters. For more information about HDFS locations, see Creating a location for HDFS. | November 3, 2021 |
| New AWS Region | AWS DataSync is now available in the Asia Pacific (Osaka) Region. For more information, see AWS DataSync regions in the AWS General Reference. | July 28, 2021 |
| Fully automated transfers between AWS storage services | AWS DataSync can now transfer files or objects between Amazon S3, Amazon EFS, or FSx for Windows File Server, with just a few clicks in the DataSync console. For more information, see Data transfer between AWS storage services. | November 9, 2020 |
| Adjusting the network bandwidth used by a running task | AWS DataSync now enables customers to adjust the network bandwidth used by a running DataSync task. This helps to minimize impact on other users or applications when a task spans multiple days. For more information, see Adjusting bandwidth throttling for a task execution. | November 9, 2020 |
### Init and Prerequisites
#### Init Setup
First of all, you need to sign up for AWS at [AWS signup](https://portal.aws.amazon.com/billing/signup), then follow the online instructions.
Now you can access the [DataSync management console](https://console.aws.amazon.com/datasync/home) to perform various sync configuration and management tasks. Additionally, you can use the AWS DataSync API or the AWS CLI to programmatically configure and manage DataSync. For more information about the API, see the [API Reference](https://docs.aws.amazon.com/datasync/latest/userguide/API_Reference.html).
You can also use the AWS SDKs to develop applications that interact with DataSync. The AWS SDKs for Java, .NET, and PHP wrap the underlying DataSync API to simplify your programming tasks. For information about downloading the SDK libraries, see [Sample code libraries](https://aws.amazon.com/developer/).
#### Requirements
AWS DataSync has two kinds of requirements: agent requirements and network requirements.
##### Agent requirements
You run DataSync on-premises as a virtual machine (VM).
DataSync supports the following hypervisor versions and hosts:
* VMware ESXi Hypervisor (version 6.5, 6.7, or 7.0)
* Microsoft Hyper-V Hypervisor
* Linux Kernel-based Virtual Machine (KVM)
* Amazon EC2 instance
###### Virtual machine requirements
When deploying AWS DataSync on-premises, make sure that the underlying hardware where you deploy the DataSync VM can dedicate the following minimum resources:
* Virtual processors – Four virtual processors assigned to the VM.
* Disk space – 80 GB of disk space for installation of VM image and system data.
* RAM – Depending on your configuration, one of the following:
* 32 GB of RAM assigned to the VM, for tasks that transfer up to 20 million files.
* 64 GB of RAM assigned to the VM, for tasks that transfer more than 20 million files.
More detailed information on the agent requirements is available under [aws-agentReq].
##### Network requirements
Using DataSync to transfer your data requires access to certain network ports and endpoints.
The following network requirements are defined:
* Network requirements to connect to your self-managed storage
* Network requirements when using VPC endpoints
* Network requirements when using public service endpoints or FIPS endpoints
Further details on these requirements are available under [aws-networkReq].
### Documentation and Research
Under the following link you can find general information about AWS-Datasync: [aws-docs].
On this site there are the following points:
* Use cases
* Benefits
* Additional AWS DataSync resources
The page links to a PDF [aws-information], which covers the following points:
* What is AWS Data Sync?
* How AWS DataSync works
* Setting up
* Requirements
* Getting started
* Using the AWS CLI
* Monitoring your task
* Working with tasks
* Working with tasks executions
* Working with locations
* Working with agents
* Using the VM local console
* Security
* Troubleshooting
* Troubleshooting an EC2 agent
* DataSync quotas and limits
* Additional resources
* API Reference
* Document history
### Coding
Most of the required AWS DataSync operations can be performed using the **AWS Command Line Interface**.
*You can use these commands to create an agent, create source and destination locations, and run a task.
Before you begin, we recommend reading How AWS DataSync works to understand the components and terms used in DataSync and how the service works. We also recommend reading Using identity-based policies (IAM policies) for DataSync to understand the AWS Identity and Access Management (IAM) permissions that DataSync requires.
Before you use AWS CLI commands, install the AWS CLI. For information about how to install the AWS CLI, see Installing the AWS Command Line Interface in the AWS Command Line Interface User Guide. After you install the AWS CLI, you can use the help command to see the DataSync operations and the parameters associated with them.* [aws-datasync-how-it-works][aws-cli-install][aws-cli-guide]
To see the available operations, enter the following command. [aws-cli-guide]
```
aws datasync help
```
To see the parameters associated with a specific operation, enter the following command. [aws-cli-guide]
```
aws datasync <operation> help
```
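The same operations are also available programmatically through the AWS SDKs. A minimal sketch of assembling arguments for the `CreateTask` API with boto3; the ARNs are hypothetical placeholders, and the actual client call is commented out because it needs real credentials and locations:

```python
# Sketch only: running the commented call needs boto3 installed,
# AWS credentials configured, and real DataSync location ARNs.

def datasync_task_args(source_arn, destination_arn, name):
    """Assemble the keyword arguments for DataSync's CreateTask API call."""
    return {
        "SourceLocationArn": source_arn,
        "DestinationLocationArn": destination_arn,
        "Name": name,
        # Verify every transferred file against its source checksum.
        "Options": {"VerifyMode": "POINT_IN_TIME_CONSISTENT"},
    }

args = datasync_task_args(
    "arn:aws:datasync:eu-west-1:111122223333:location/loc-src",  # placeholder
    "arn:aws:datasync:eu-west-1:111122223333:location/loc-dst",  # placeholder
    "nightly-sync",
)
# import boto3
# client = boto3.client("datasync")
# task = client.create_task(**args)
print(args["Name"])
```

Separating argument assembly from the API call keeps the configuration unit-testable without touching AWS.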
#### API Reference
*"In addition to using the console, you can use the AWS DataSync API to programmatically configure and manage DataSync and its resources. This section describes the AWS DataSync operations and data types and contains the API Reference documentation for AWS DataSync."* [aws-api-ref]
##### Topics
* Actions
* Data Types
* Common Errors
* Common Parameters
### Testing
### Deployment
To deploy a DataSync agent to VMware, follow these steps:
1. Open the AWS DataSync console at https://console.aws.amazon.com/datasync/
2. If you don't have an agent, on the **Create agent** page in the console, choose **Download image** in the **Deploy agent** section. This downloads the agent, which is provided as a VM, and you deploy it in your VMware ESXi hypervisor. If you want to deploy the agent as an Amazon EC2 instance instead, see *Deploy your agent as an Amazon EC2 instance to access in-cloud file systems*. For the VM's hardware requirements, see *Virtual machine requirements*; for how to deploy an `.ova` file on a VMware host, see the documentation for your hypervisor. If you have previously activated an agent in this AWS Region and want to reuse it, choose that agent and then choose **Create agent**. The **Configure a source location** page appears.
3. Power on your hypervisor, log in to your VM, and get the IP address of the agent. You need this IP address to activate the agent.
[aws-datasync-vmware]
## Parse
*Parse is the most used open-source framework to develop application backends. It helps developers to accelerate app development and reduces the total amount of effort required to build an app. A large community of engaged developers supports the platform and has been evolving it since 2016. It’s a great tool to develop apps quickly and under an affordable budget.* - [parse]
**Team:** Resch, _Hobor_, Potok, ?, ?
__An overview follows:__
Hobor
The Parse Platform consists of Parse Server and a range of adapters. There is also a whole host of client SDKs, including JavaScript, iOS, Android, and PHP, plus the new additions of a pure Swift SDK and a Flutter SDK. [parse-com]
Parse lets you build applications faster with object and file storage, user authentication, push notifications, a dashboard, and more.
Parse enables groups to quickly set up a collective, raise funds, and manage them transparently. [parse-intro]
The team consists of four members.
### Tutorials / Intros / Documentation
Hobor
On the homepage of Parse there is a section called "Documentation", where you can learn more about deploying a Parse Server or learn about the client SDKs. [parse-plattform]
#### Parse Server Guide
Parse Server is an open source backend that can be deployed to any infrastructure that can run Node.js.[parse-server-guide]
#### Client SDK Guides
Here they have comprehensive guides for each platform. You can also take a look at the detailed API references and tutorials.[parse-plattform]
### Who is using it?
Potok
Companies that reportedly use **Parse** in their tech stacks: [stackshare-parse]
* Accenture
* Iziwork
* Weebly
* Bubble
* Lifesum
* Etc.
### Community / Help
Resch
On the homepage of Parse there is a section called "Help & Communication", where you can either visit Stack Overflow, the Community Forum, or GitHub. [parse-plattform]
#### GitHub
On GitHub there are 50 repositories and 15 members. The project seems up to date, as updates are made regularly. For example, the repository _parse-community/parse-dashboard_ has 784 closed and only 62 open issues. Furthermore, the issues appear to be actively worked on, as only a few old ones remain. [parse-github]
#### Stack Overflow
Stack Overflow as a whole has 22,122,944 questions, 6,711,285 of them without an upvoted or accepted answer. The Parse community there seems less active than on GitHub, as many questions take a long time to get answered.
#### Community Forum
On the website of Parse there is also a community forum, which appears to be active, but less so than GitHub. It contains fewer questions and is also not as clearly organized as GitHub.
### Changelogs / Development
Resch
| Release type | Stability | NPM channel | Branch | Purpose | Suitable environment |
|---|---|---|---|---|---|
| ✅ Stable | stable | `@latest` | release | official release | production |
| ⚠️ Beta | pretty stable | `@beta` | beta | feature maturation | development |
| 🔥 Alpha | unstable | `@alpha` | alpha | product development | experimental |
### Init and Prerequisites
Hobor
* Node 8 or newer [parse-plattform]
* MongoDB version 3.6 [parse-plattform]
* Python 2.x [parse-plattform]
* For deployment, an infrastructure provider like Heroku or AWS [parse-plattform]
#### Getting Started
Potok
*The fastest and easiest way to get started is to run MongoDB and Parse Server locally. Use the bootstrap script to set up Parse Server in the current directory.* [parse-server-guide]
```bash
sh <(curl -fsSL https://raw.githubusercontent.com/parse-community/parse-server/master/bootstrap.sh)
npm install -g mongodb-runner
mongodb-runner start
npm start
```
### Documentation and Research
### Coding
Hobor
To write the code, any IDE or editor will do.
Generally, you can get started with this article: [parse-getting-started].
You can also take a closer look at the API on this page: [parse-plattform].
### Testing
### Deployment
Hobor
Parse can be hosted in any cloud that runs Node.js. The requirements are Node 4.3 and MongoDB 2.6.x, 3.0.x, or 3.2.x. [parse]
#### Heroku and MongoDB Atlas
Heroku and MongoDB Atlas provide an easy way to deploy Parse Server. Parse Server can also be deployed to Glitch, mLab, and Back4App. [parse-plattform]
#### mLab
mLab provides a Database-as-a-Service for MongoDB. After creating a database user with connect access, construct a MongoDB connection string: `mongodb://yourusername:yourpassword@yourmlabdatabaseaddress.mlab.com:yourdatabaseport/yourdatabasename`.
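When building such a connection string programmatically, special characters in the password must be percent-encoded or the URI is invalid. A small sketch with placeholder credentials and host:

```python
from urllib.parse import quote_plus

def mongo_uri(user, password, host, port, database):
    """Build a MongoDB connection string, percent-encoding the password
    so characters like '@' don't break the URI."""
    return f"mongodb://{user}:{quote_plus(password)}@{host}:{port}/{database}"

print(mongo_uri("appuser", "p@ss word", "example.mlab.com", 27017, "parse"))
# mongodb://appuser:p%40ss+word@example.mlab.com:27017/parse
```

The resulting string is what you would pass to Parse Server's `databaseURI` option.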
#### Glitch
Glitch provides an easy way to instantly create and deploy Node.js applications for free. Once your app is deployed you should see: `parse-server-example running on port 3000.`
## Resources
* [couchbase-doc] "Couchbase Documentation" [online](https://docs.couchbase.com/home/index.html)
* [couchdb-doc] "Why CouchDB?" [online](https://docs.couchdb.org/en/stable/intro/why.html)
* [firestore] "Cloud Firestore" [online](https://firebase.google.com/docs/firestore)
* [aws-doc] "AWS Datasync" [online](https://aws.amazon.com/datasync/)
### Parse
* [parse] "Parse - Back4App" [online](https://www.back4app.com/parse)
* [parse-plattform] "Parse Platform" [online](https://parseplatform.org/)
* [parse-stackOverflow] "Help & Communication" [online](https://stackoverflow.com/tags/parse-platform)
* [parse-community] "Community Forum" [online](https://community.parseplatform.org/)
* [parse-github] "Parse Community" [online](https://github.com/parse-community)
* [parse-server] "Parse Community, Parse Server" [online](https://github.com/parse-community/Parse-Server)
* [parse-server-guide]"Parse Server"[online](https://docs.parseplatform.org/parse-server/guide/)
* [parse-com] "Parse Community / About" [online](https://opencollective.com/parse-server)
* [stackshare-parse] "Stackshare Parse" [online](https://stackshare.io/parse)
* [parse-intro] "Introduction" [online](https://docs.opencollective.com/help/about/introduction)
* [parse-getting-started] "Getting Started" [online](https://docs.parseplatform.org/parse-server/guide/#getting-started)
### AWS Datasync
* [aws-customers] "AWS Datasync Customers" [online](https://aws.amazon.com/datasync/customers/)
* [aws-history] "AWS Document History" [online](https://docs.aws.amazon.com/datasync/latest/userguide/doc-history.html)
* [aws-tutorial] "Getting started with AWS DataSync" [online](https://docs.aws.amazon.com/datasync/latest/userguide/getting-started.html)
* [aws-docs] "What is AWS DataSync?" [online](https://docs.aws.amazon.com/datasync/latest/userguide/what-is-datasync.html#first-time-user)
* [aws-information] "AWS DataSync User Guide" [online](https://docs.aws.amazon.com/datasync/latest/userguide/sync-dg.pdf#what-is-datasync)
* [aws-agentReq] "Agent requirements" [online](https://docs.aws.amazon.com/datasync/latest/userguide/agent-requirements.html)
* [aws-networkReq] "Network requirements for DataSync" [online](https://docs.aws.amazon.com/datasync/latest/userguide/datasync-network.html)
* [aws-datasync-how-it-works] "How AWS DataSync works" [online](https://docs.aws.amazon.com/datasync/latest/userguide/how-datasync-works.html)
* [aws-cli-install] "Installing or updating the latest version of the AWS CLI" [online](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html)
* [aws-cli-guide] "Using the AWS Command Line Interface with AWS DataSync" [online](https://docs.aws.amazon.com/datasync/latest/userguide/using-cli.html)
* [aws-tut] "AWS DataSync" [online](https://tutorialsdojo.com/aws-datasync/)
* [aws-overview] "Load, Store, and Protect Linux-Based NFS Workloads in AWS" [online](https://d1.awsstatic.com/whitepapers/aws-load-store-protect-linux.pdf)
* [aws-datasync-vmware] "AWS Datasync provision agent on vmware" [online](https://docs.aws.amazon.com/datasync/latest/userguide/deploy-agents.html)
### Couchbase
* [couchbase-intro] "Introduction" [online]
* [docker-installation] "Installation of Docker" [online](https://docs.docker.com/engine/install/)
* [couchbase-installation] "Installation of Couchbase" [online](https://hub.docker.com/_/couchbase)
* [Couchbase-server-changelog] "Release Notes for Couchbase Server 7.0" [online](https://docs.couchbase.com/server/current/release-notes/relnotes.html)
* [Couchbase-lite-changelog] "Release Notes" [online](https://docs.couchbase.com/couchbase-lite/current/csharp/release-notes.html)
* [couchbase-tutorial] "Couchbase Tutorials" [online](https://docs.couchbase.com/tutorials/index.html#)
* [couchbase-gs-ce] "Getting Started with Couchbase Server Community Edition (CE)" [online](https://docs.couchbase.com/tutorials/getting-started-ce/index.html)
* [couchbase-forum] "Couchbase Forum" [online](https://forums.couchbase.com)
### CouchDB
* [couchdb-install] "Installation of CouchDB" [online](https://docs.couchdb.org/en/stable/install/index.html)
* [couchdb-setup] "Setup a Cluster in CouchDB" [online](https://docs.couchdb.org/en/stable/setup/index.html)
* [couchdb-configuration] "Configuring CouchDB" [online](https://docs.couchdb.org/en/stable/config/index.html)
* [couchdb-github] "apache/couchdb" [online](https://github.com/apache/couchdb)
* [couch-db] "Official Website of CouchDB" [online](https://couchdb.apache.org/)
* [couch-db-wikipedia] "CouchDB Wikipedia" [online](https://en.wikipedia.org/wiki/Apache_CouchDB)
* [couch-db-blog] "CouchDB Blog" [online](https://blog.couchdb.org/)
* [couch-db-tutorial] "CouchDB Tutorial" [online](https://docs.couchdb.org/en/stable/intro/tour.html)
* [couch-db-intro] "CouchDB Introduction" [online](https://docs.couchdb.org/en/stable/intro/index.html)
* [couchdb-stackoverflow] "Questions tagged [couchdb]" [online](https://stackoverflow.com/questions/tagged/couchdb)
* [couchdb-docs] "CouchDB Documentation" [online](https://docs.couchdb.org/en/stable/)
* [couch-db-server-api] "CouchDB Server HTTP API" [online](https://docs.couchdb.org/en/stable/api/server/index.html#api-server)
* [couch-db-database-api] "CouchDB Database HTTP API" [online](https://docs.couchdb.org/en/stable/api/database/index.html#api-database)
* [couch-db-document-api] "CouchDB Document HTTP API" [online](https://docs.couchdb.org/en/stable/api/document/index.html#api-document)
* [couch-db-replication-api] "CouchDB Replication HTTP API" [online](https://docs.couchdb.org/en/stable/api/server/common.html#api-server-replicate)
* [couch-db-install-windows] "CouchDB install windows" [online](https://docs.couchdb.org/en/stable/install/windows.html)
* [couchdb-authentication] "CouchDB how to use authentication for specific endpoints" [online](https://docs.couchdb.org/en/stable/intro/security.html)
* [couchdb-gettingStarted] "CouchDB Quick Guide to Using API Requests" [online](https://docs.couchdb.org/en/stable/intro/tour.html)
* [couchdb-api] "CouchDB Core API Elements and how to use them" [online](https://docs.couchdb.org/en/stable/intro/api.html)
* [couchdb-architecture] "CouchDB architecture" [online](https://www.geeksforgeeks.org/introduction-to-apache-couchdb/#:~:text=CouchDB%20is%201.61.-,Architecture,data%2C%20documents%2C%20and%20views.)