Nig*_*Fds 20 continuous-integration bitbucket amazon-web-services continuous-deployment aws-codepipeline
I want to integrate code from Bitbucket into AWS CodePipeline, but I can't find any examples of this. My source is .NET. Can someone guide me? Thanks.
Kir*_*iya 15
You can integrate Bitbucket with AWS CodePipeline using a webhook that calls AWS API Gateway, which in turn invokes a Lambda function (which calls CodePipeline). There is an AWS blog that walks you through it: Integrating Git with AWS CodePipeline.
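A minimal sketch of what that Lambda function might look like: API Gateway hands it Bitbucket's webhook POST, it parses out the pushed branch, and starts a CodePipeline execution. The pipeline name "my-pipeline" and the master-only filter are assumptions, not part of the blog post.

```python
import json

def extract_branch(body):
    """Pull the pushed branch name from a Bitbucket Cloud push payload."""
    changes = body.get("push", {}).get("changes", [])
    if not changes:
        return None
    return changes[0].get("new", {}).get("name")

def lambda_handler(event, context):
    # API Gateway delivers Bitbucket's webhook POST body as a JSON string
    body = json.loads(event.get("body") or "{}")
    if extract_branch(body) != "master":
        return {"statusCode": 200, "body": "ignored: not master"}
    # boto3 is available in the Lambda runtime; imported here so the
    # parsing logic above stays testable without AWS credentials
    import boto3
    codepipeline = boto3.client("codepipeline")
    resp = codepipeline.start_pipeline_execution(name="my-pipeline")  # assumed name
    return {"statusCode": 200, "body": resp["pipelineExecutionId"]}
```

The handler returns early for non-master pushes so every feature-branch commit does not trigger a deployment.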
ade*_*sin 11
BitBucket has a service called Pipelines that can deploy code to AWS services. Use Pipelines to package and push updates from your master branch to an S3 bucket hooked up to CodePipeline.
Note:
- You must enable Pipelines for your repository.
- Pipelines requires a file named bitbucket-pipelines.yml, which must be placed in your project.
- Make sure you set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY for your account in the BitBucket Pipelines UI. These come with an encryption option, so everything is safe and secure.
Below is an example bitbucket-pipelines.yml that copies the contents of a directory named DynamoDb to an S3 bucket:
pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update # required to install zip
            - apt-get install -y zip # required if you want to zip repository objects
            - zip -r DynamoDb.zip .
            - apt-get install -y python-pip
            - pip install boto3==1.3.0 # required for s3_upload.py
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is the bucket key
            - python s3_upload.py LandingBucketName DynamoDb.zip DynamoDb.zip # run the deployment script
Here is a working example of the Python upload script, which should be deployed alongside the bitbucket-pipelines.yml file in your project. Above, I named my Python script s3_upload.py:
from __future__ import print_function
import os
import sys
import argparse
import boto3
from botocore.exceptions import ClientError

def upload_to_s3(bucket, artefact, bucket_key):
    """
    Uploads an artefact to Amazon S3
    """
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        client.put_object(
            Body=open(artefact, 'rb'),
            Bucket=bucket,
            Key=bucket_key
        )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)

if __name__ == "__main__":
    main()
Here is an example CodePipeline with only a Source stage (you will probably want to add more):
Pipeline:
  Type: "AWS::CodePipeline::Pipeline"
  Properties:
    ArtifactStore:
      # Where CodePipeline copies and unpacks the uploaded artifact
      # Must be versioned
      Location: !Ref "StagingBucket"
      Type: "S3"
    DisableInboundStageTransitions: []
    RoleArn:
      !GetAtt "CodePipelineRole.Arn"
    Stages:
      - Name: "Source"
        Actions:
          - Name: "SourceTemplate"
            ActionTypeId:
              Category: "Source"
              Owner: "AWS"
              Provider: "S3"
              Version: "1"
            Configuration:
              # Where Pipelines uploads the artifact
              # Must be versioned
              S3Bucket: !Ref "LandingBucket"
              S3ObjectKey: "DynamoDb.zip" # Zip file that is uploaded
            OutputArtifacts:
              - Name: "DynamoDbArtifactSource"
            RunOrder: "1"

LandingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"
StagingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"
A reference for this Python code, along with other examples, can be found here: https://bitbucket.org/account/user/awslabs/projects/BP
Oll*_*nja 10
A follow-up for anyone finding this now:
AWS CodeBuild now supports Atlassian Bitbucket Cloud as a source type, making it the fourth supported source alongside the existing ones: AWS CodeCommit, Amazon S3, and GitHub.
This means you no longer need to implement a Lambda function as suggested in the link @Kirkaiya posted for integrating with Bitbucket - it is still a valid solution depending on your use case, or if you are integrating with a non-cloud version of Bitbucket.
Posted on the AWS blog August 10, 2017 - https://aws.amazon.com/about-aws/whats-new/2017/08/aws-codebuild-now-supports-atlassian-bitbucket-cloud-as-a-source-type/
If you are looking for a way to automate your build and deploy process using AWS CodePipeline with Bitbucket as the source, without using a Lambda function, follow these steps.
Note:
1. To create a webhook you need to have Bitbucket admin access, so that the process from commit to deployment is fully automated.
2. As of now (April '19), CodeBuild does not support webhooks on pull request merge. If you need that, you can create a trigger that runs a CodeBuild build daily.
You can also create triggers to build your code periodically: https://docs.aws.amazon.com/codebuild/latest/userguide/trigger-create.html
Update (June '19) - CodeBuild now supports pull-request builds on PR merge. Reference: https://docs.aws.amazon.com/codebuild/latest/userguide/sample-bitbucket-pull-request.html#sample-bitbucket-pull-request-filter-webhook-events
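Following the filter-webhook-events docs linked above, creating such a webhook can be sketched with boto3's CodeBuild client. This is an illustrative sketch: the project name is a placeholder, and you need admin rights on the linked Bitbucket repository.

```python
def merged_pr_filter_groups():
    # One filter group: fire only when a pull request is merged
    return [[{"type": "EVENT", "pattern": "PULL_REQUEST_MERGED"}]]

def create_pr_merge_webhook(project_name):
    # boto3 imported here so the filter helper stays testable offline
    import boto3
    client = boto3.client("codebuild")
    return client.create_webhook(
        projectName=project_name,
        filterGroups=merged_pr_filter_groups(),
    )
```

Swapping the pattern to PUSH (or adding a second filter group) widens the trigger to ordinary commits.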
An alternative to @binary's answer, and a clarification of @OllyTheNinja's answer:
In short: let CodeBuild listen to Bitbucket's webhook and write to an S3 object, then have the pipeline listen for update events on that object.
In the AWS code suite:
- define a CodeBuild project, using
- define the pipeline:
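The CodeBuild side of that handoff could use a buildspec along these lines. This is a sketch only: the zip name and packaging step are assumptions mirroring the earlier answer, and the project's artifacts must be configured to upload to the versioned S3 bucket that the pipeline's S3 source action watches.

```yaml
version: 0.2
phases:
  build:
    commands:
      # package the repository the same way the Pipelines answer above does
      - zip -r DynamoDb.zip .
artifacts:
  files:
    - DynamoDb.zip
```

Each webhook-triggered build then produces a new object version, and the pipeline's S3 source action picks it up automatically.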
Viewed: 17,212 times