Support deploying lambda functions with custom CDK category #9055
@shishkin Yes, the Amplify CLI does not package and upload the Lambda function's custom source code for you. You can use …
Thanks for the clarification @kaustavghosh06. Maybe I don't understand how Amplify integrates CDK, but this seems unfortunate, since CDK can package assets. Is there a way to use custom build steps or hooks to get CDK's behavior integrated into Amplify?
Yeah, at this moment we don't support packaging assets out of the box, but you can try using …. Also, any specific reason you won't lean on using …?
Because I need custom resources provisioned and my Lambda configured with env vars pointing to them. I couldn't find a way to get ARN outputs from CDK to an Amplify Lambda.
Another issue I ran into with Amplify functions is that S3 triggers from Amplify Storage set up object key filters that I can't figure out how to configure.
I have run into this issue as well when trying to use Amplify+CDK to deploy Lambda functions from Docker images. The CDK performs the docker build on the host machine to create the image and uploads it to ECR, but this does not work with Amplify+CDK. I use this CDK feature because I am deploying Lambda functions in Swift, which amplify add function does not support. Is asset packaging something that is on the roadmap?
Found this thread after spending an afternoon trying to package up my custom Lambda created in CDK. According to this thread, it doesn't work. However, if we use …

Edit: I actually solved this. You can reference a new lambda.Function from an ARN. I used the ARN that I created via the cdk.Fn.ref function.
I have been using Amplify extensively, and before having the ability to use custom CDK code, all my backend was typically done using AWS SAM. I recently started migrating some of my CloudFormation backend containing IoT resources, queues, layers, and Lambdas to CDK, because when you have multiple Lambda functions it is much more convenient to use CDK than amplify add function. It would be tremendously beneficial to fully support CDK deployments through the Amplify custom CDK category, as it would let Amplify control the entire front-end/back-end deployment.
I would also like to voice my support for using …
I haven't tried it with Amplify myself, but this CDK module might do asset bundling as part of its resource synthesis and thus should not rely on the CDK CLI to do bundling: https://docs.aws.amazon.com/cdk/api/latest/docs/aws-lambda-nodejs-readme.html. There is also this community construct for esbuild: https://github.com/mrgrain/cdk-esbuild. I just switched to using plain CDK instead of the Amplify CLI, as I found it more productive over the long run to learn what Amplify is doing under the hood and replicate it with CDK than to waste day after day troubleshooting Amplify's confusing error messages and working around its idiosyncrasies. Some resources I've found for that: https://serverless-nextjs.com/docs/cdkconstruct/ to replicate Hosting and https://github.com/bobbyhadz/cdk-identity-pool-example to replicate Auth. The remaining building blocks are just plain DynamoDB and S3 constructs from CDK.
I plan to use more and more CDK as well, but figured I could still just use it within Amplify. If I continue to run into hurdles like this, I guess completely breaking free makes sense. Thanks for the suggestion, will try it out.
I can report that …
I raised this issue (aws/aws-cdk#18090) with …
We need this as well, or at least the ability to output variables from custom CDK code to the rest of Amplify. Our use case is a complex video encoding pipeline set up in CDK, for which we need to bundle customized Lambda functions. We need role and bucket ARNs that are created in CDK, and we can't get them over to the Amplify functions.
@kaustavghosh06, as @DylanBruzenak said:

We also need this ability.
Any update on this issue?
I tried to create an AwsCustomResource like …

CDK would create the Lambda function automatically and upload the code to an S3 bucket. Checking the generated CloudFormation template, I can only see the reference to the S3 bucket, but no creation-and-upload functionality, and I get this error on amplify push: …
Any update? |
+1 |
@ykethan With the recent announcement of Amplify CLI Beta supporting CDK v2, does that come with proper support for deploying the full Cloud Assembly, including the assets? |
It would be awesome if the documentation here: https://docs.amplify.aws/cli/custom/cdk/ could reflect the limitations described in this issue. |
Ditto. A disappointing waste of time thinking you can use the CDK as usual with a custom resource, only to run into this issue.
I ran into the same problem with ….

We've got two big projects that started out with Amplify, but then we added some extra stuff with CDK (like EventBridge and Step Functions as workflows). I thought it would be awesome to combine everything into one big repo with Amplify, CDK, and our frontend (which is SvelteKit, obviously!), but it's not quite doable yet. Amplify is great for getting Auth, Storage, and AppSync with DynamoDB up and running quickly, but we end up doing a lot of custom coding after that. If CDK and Amplify played nicely together, it would be an amazing tool for rapid development.
Precisely! I ended up ditching Amplify for everything except Auth, Hosting, and AppSync. Everything else I'm deploying w/ CDK in a separate repo.
We're doing everything with pure CDK these days. Works pretty well once you get everything rolling. |
Really sad to have to go that way, but here is a working solution for a Flink application (the same principle applies to Lambda or other assets) using a hook:

amplify/hooks/pre-push.js

const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk');
const zip = require('adm-zip');
const crypto = require('crypto');
try {
const parameters = JSON.parse(fs.readFileSync(0, { encoding: 'utf8' }));
// console.log('Parameters: ', JSON.stringify(parameters));
// Retrieve amplify env
const { envName } = parameters.data.amplify.environment;
// console.log('Amplify envName: ', envName);
// Retrieve the S3 bucket name from the amplify/team-provider-info.json
const teamProviderInfo = JSON.parse(
fs.readFileSync(path.join(__dirname, '../team-provider-info.json'), {
encoding: 'utf8',
})
);
// console.log('teamProviderInfo: ', JSON.stringify(teamProviderInfo));
const s3BucketName =
teamProviderInfo[envName].awscloudformation.DeploymentBucketName;
// Load profile used by amplify
const localInfo = JSON.parse(
fs.readFileSync(path.join(__dirname, '../.config/local-aws-info.json'), {
encoding: 'utf8',
})
);
const profile = localInfo[envName].profileName;
// console.log('Profile: ', profile);
// TODO: Add envName to the zip file name
// Zip content of amplify/backend/custom/measurementAggregator/flink as flink-{hash}.zip
const flinkCodePath = path.join(
__dirname,
'../backend/custom/measurementAggregator/flink'
);
// Calculate hash of flink/tumbling-windows.py file
const hash = crypto.createHash('sha256');
hash.update(fs.readFileSync(path.join(flinkCodePath, 'tumbling-windows.py')));
const hashValue = hash.digest('hex');
const zipDestinationPath = path.join(__dirname, `flink-${hashValue}.zip`);
// Check if zip file already exists. if it does that means content has not changed and we can skip the upload
if (!fs.existsSync(zipDestinationPath)) {
// eslint-disable-next-line new-cap
const zipFile = new zip();
zipFile.addLocalFolder(flinkCodePath);
zipFile.writeZip(zipDestinationPath);
// Upload zipDestinationPath to the S3 bucket
const credentials = new AWS.SharedIniFileCredentials({ profile });
AWS.config.credentials = credentials;
const s3 = new AWS.S3();
const uploadParams = {
Bucket: s3BucketName,
Key: zipDestinationPath.split('/').pop(),
Body: fs.readFileSync(zipDestinationPath),
};
s3.upload(uploadParams, (err, data) => {
if (err) {
console.error('Error', err);
throw err;
}
if (data) {
console.log('Upload Success', data.Location);
process.exit(0);
}
});
} else {
console.log('No changes detected. Skipping upload');
process.exit(0);
}
} catch (error) {
console.log(error);
process.exit(1);
}

amplify/backend/custom/measurementAggregator/cdk-stack.ts

import * as cdk from '@aws-cdk/core';
import * as AmplifyHelpers from '@aws-amplify/cli-extensibility-helper';
import { AmplifyDependentResourcesAttributes } from '../../types/amplify-dependent-resources-ref';
import * as iam from '@aws-cdk/aws-iam';
import * as lambda from '@aws-cdk/aws-lambda';
import * as kinesis from '@aws-cdk/aws-kinesis';
import * as flink from '@aws-cdk/aws-kinesisanalytics-flink';
import { KinesisEventSource } from '@aws-cdk/aws-lambda-event-sources';
import * as s3 from '@aws-cdk/aws-s3';
import * as crypto from 'crypto';
import * as fs from 'fs';
import * as path from 'path';
export class cdkStack extends cdk.Stack {
constructor(
scope: cdk.Construct,
id: string,
props?: cdk.StackProps,
amplifyResourceProps?: AmplifyHelpers.AmplifyResourceProps
) {
super(scope, id, props);
/* Do not remove - Amplify CLI automatically injects the current deployment environment in this input parameter */
new cdk.CfnParameter(this, 'env', {
type: 'String',
description: 'Current Amplify CLI env name',
});
const beforeAgregate = new kinesis.Stream(this, 'BeforeAgregate', {
streamName: 'AldoBeforeAggregate',
});
// Calculate hash of flink/tumbling-windows.py file
const hash = crypto.createHash('sha256');
hash.update(fs.readFileSync(path.join(__dirname, '../flink/tumbling-windows.py')));
const hashValue = hash.digest('hex');
const fileKey = `flink-${hashValue}.zip`;
// Get the deployment bucket name from the amplify meta file
const amplifyProjectInfo = AmplifyHelpers.getProjectInfo();
console.log('amplifyProjectInfo', JSON.stringify(amplifyProjectInfo));
const envName = amplifyProjectInfo.envName;
const teamProviderInfo = JSON.parse(
fs.readFileSync(path.join(__dirname, '../../../../team-provider-info.json'), {
encoding: 'utf8',
})
);
const s3BucketName = teamProviderInfo[envName].awscloudformation.DeploymentBucketName;
console.log('s3BucketName', s3BucketName);
const bucket = s3.Bucket.fromBucketName(this, 'FlinkAppCodeBucket', s3BucketName);
const afterAgreagte = new kinesis.Stream(this, 'AfterAgreagte', {
streamName: 'AldoAfterAggregate',
});
const propertyGroups: flink.PropertyGroups = {
'consumer.config.0': {
'aws.region': cdk.Aws.REGION,
'input.stream.name': beforeAgregate.streamName,
'scan.stream.initpos': 'LATEST',
},
'kinesis.analytics.flink.run.options': {
jarfile: 'flink-sql-connector-kinesis-1.15.2.jar',
python: 'tumbling-windows.py',
},
'producer.config.0': {
'aws.region': cdk.Aws.REGION,
'output.stream.name': afterAgreagte.streamName,
'shard.count': '4',
},
};
const agregateStreams = new flink.Application(this, 'App', {
code: flink.ApplicationCode.fromBucket(bucket, fileKey),
runtime: flink.Runtime.of('FLINK-1_15'),
propertyGroups: propertyGroups,
role: new iam.Role(this, 'Role', {
assumedBy: new iam.ServicePrincipal('kinesisanalytics.amazonaws.com'),
inlinePolicies: {
FlinkPolicy: new iam.PolicyDocument({
statements: [
new iam.PolicyStatement({
actions: ['s3:GetObject*', 's3:GetBucket*', 's3:List*'],
resources: [bucket.bucketArn, bucket.bucketArn + '/*'],
}),
],
}),
},
}),
});
beforeAgregate.grantRead(agregateStreams);
afterAgreagte.grantWrite(agregateStreams);
// Load the lambda function that will produce and consume the streams
// TODO: fix ... not working at all
const retVal: AmplifyDependentResourcesAttributes = AmplifyHelpers.addResourceDependency(
this,
amplifyResourceProps?.category ?? 'custom',
amplifyResourceProps?.resourceName ?? 'SimulationEngine',
[
{
category: 'function',
resourceName: 'SmartDashShelfMeasurementIngestion',
},
]
);
const consumerProducerFunction = lambda.Function.fromFunctionArn(
this,
'SmartDashShelfMeasurementIngestion',
cdk.Fn.ref(retVal.function.SmartDashShelfMeasurementIngestion.Arn)
);
beforeAgregate.grantWrite(consumerProducerFunction);
afterAgreagte.grantRead(consumerProducerFunction);
// Trigger consumerProducerFunction when a new record is added to the afterAgregate stream
consumerProducerFunction.addEventSource(
new KinesisEventSource(afterAgreagte, {
startingPosition: lambda.StartingPosition.LATEST,
})
);
}
}

Long story short: …
Interesting logic used to get the profile and deployment bucket inside ;)
Noting here that the reason Amplify custom resource stacks don't work very well beyond L1 resources is that the build-custom-resource step runs tsc, versus whatever the CDK CLI is doing. The CDK likely does role/dependency reconciliation after compilation or something. Not to mention the CDK has a bootstrapped bucket to push assets to, not unlike Amplify's deployment bucket. I feel like they could be blended.
I've been kicking around a way to make CDK and Amplify more Reese's cup and less salad dressing. Using a combination of amplify init, the Amplify export-to-CDK feature, and an enhanced CDK construct, I'm able to have a reasonable workflow that utilizes the best of the Amplify CLI and CDK extensibility. The main change is: always amplify export, never amplify push. Just npm run deploy, which runs the following: …
Example repo here: https://github.com/joekiller/amplify-scratch |
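As a rough sketch, the npm run deploy wiring described above might look like the following package.json scripts (the script names and the output directory are my assumptions, not taken from the linked repo):

```json
{
  "scripts": {
    "export": "amplify export --out ./amplify-export",
    "deploy": "npm run export && cdk deploy --all"
  }
}
```

The point of the export-first flow is that the CDK CLI, not the Amplify CLI, performs the actual deployment, so CDK asset bundling works as usual.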
Is there any update on fixing this issue? I would like to deploy static files to an S3 bucket with CDK's …
This ticket has been open for 2 years now. Many people are complaining. Could you please increase the priority of this issue?
Any update? I'd like to be able to use cloudfront.experimental.EdgeFunction(), and add function in the CLI doesn't support that. The reason is that I want to intercept requests to CloudFront and extract parameters from the URL string; this is needed for AWS Marketplace integration.
I also have the same use case on Gen2. |
Before opening, please confirm: …

How did you install the Amplify CLI? No response
If applicable, what version of Node.js are you using? No response
Amplify CLI Version: 7.4.5
What operating system are you using? Mac
Amplify Categories: Not applicable
Amplify Commands: push

Describe the bug
Unable to deploy a Lambda function via a custom CDK stack through Amplify. The same example works in a standalone CDK project. The error seems to imply that code assets are not built; I also don't see my function code zipped anywhere, as CDK usually does.

Expected behavior
Lambda function deployed.

Reproduction steps

GraphQL schema(s)
# Put schemas below this line

Log output

Additional information
No response