Atlassian Data Center Installation Automation
Introduction
This repository is a suite of Ansible roles, playbooks and support scripts to automate the installation and maintenance of Atlassian Data Center products in cloud environments.
Usage
Cloud DC-node deployment playbooks
The usual cloud-deployment scenario is to invoke this automation as part of the post-creation actions performed while a new product node is being brought up. On AWS this is usually done via cfn-init/user-data. For example, the Jira quickstart template creates a per-node launch configuration that fetches this repository and runs the appropriate AWS/product playbook, which in turn applies the relevant roles.
In practice, the Ansible roles require some information about the infrastructure
that was deployed (e.g. the RDS endpoint/password). The way this is currently
achieved on AWS is to have the CloudFormation template dump this information
into the file /etc/atl as RESOURCE_VAR=<resource> lines. This file can then be
sourced as environment variables and retrieved at runtime. See the
helper script bin/ansible-with-atl-env and the corresponding
group_vars/aws_node_local.yml var-file.
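As a rough illustration of this pattern, a group-vars file in the style of group_vars/aws_node_local.yml can map environment variables (sourced from /etc/atl) onto Ansible variables using the built-in env lookup. The variable and environment-variable names below are illustrative, not necessarily the ones this repository uses:

```yaml
# Sketch of a group_vars-style mapping from environment to Ansible vars.
# Assumes a wrapper such as bin/ansible-with-atl-env has already sourced
# /etc/atl, e.g. a line like ATL_DB_HOST=mydb.example.com is now in the
# environment. Names here are hypothetical.
atl_db_host: "{{ lookup('env', 'ATL_DB_HOST') }}"
atl_db_password: "{{ lookup('env', 'ATL_DB_PASSWORD') }}"
```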
Maintenance playbooks
(None currently; TBW)
Development
Development philosophy
The suite is intended to consist of a number of small, composable roles that can be combined into playbooks. The roles should be as platform-agnostic as possible, with platform-specific functionality broken out into more specific roles.
Where possible the roles are also product-agnostic (e.g. downloads), with more specific functionality added in later product-specific roles.
Roles should be reasonably self-contained, with sensible defaults configured in
<role>/defaults/main.yml and overridden by the playbook at runtime. Roles may
implicitly depend on variables being defined elsewhere where they cannot define
them natively (e.g. the jira_config role depends on the atl_cluster_node_id
var being defined; on AWS this is provided by the aws_common role, which
should be run first).
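As a minimal sketch of this composition model (the aws_common and jira_config roles and the atl_cluster_node_id variable come from the example above; the host group and the overridden variable are hypothetical):

```yaml
# Sketch only: a playbook combining a platform-specific role with a
# product-specific one, overriding a role default at playbook level.
- hosts: jira_nodes                # hypothetical host group
  roles:
    - role: aws_common             # provides atl_cluster_node_id on AWS; run first
    - role: jira_config            # implicitly depends on atl_cluster_node_id
  vars:
    # Hypothetical override of a default defined in <role>/defaults/main.yml
    atl_example_option: "overridden-by-playbook"
```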
Development and testing
See the Development documentation for details on setting up a development environment and running tests.
Ansible layout
- Helper scripts are in bin/. In particular the bin/ansible-with-atl-env wrapper is of use during AWS node initialisation. See Usage above for more information.
- Inventory files are under inv/. For AWS cfn-init the inv/aws_node_local inventory is probably what you want.
  - Note that this expects the environment to be set up with infrastructure information; see Usage above.
- Global group vars are loaded automatically from group_vars/<group>.yml. In particular note group_vars/aws_node_local.yml, which loads infrastructure information from the environment.
- Roles are under roles/.
  - Platform-specific roles start with <platform-shortname>_..., e.g. roles/aws_common/.
  - Similarly, product-specific roles should start with <product>_....
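Putting the paths above together, the top-level layout looks roughly like this (only the directories mentioned in this section are shown):

```
.
├── bin/           # helper scripts, e.g. ansible-with-atl-env
├── inv/           # inventory files, e.g. inv/aws_node_local
├── group_vars/    # auto-loaded group vars, e.g. aws_node_local.yml
└── roles/         # roles; aws_* platform-specific, <product>_* product-specific
```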
License
Copyright © 2019 Atlassian Corporation Pty Ltd. Licensed under the Apache License, Version 2.0.