Support deploying lambda functions with custom CDK category #9055

Open
shishkin opened this issue Nov 23, 2021 · 32 comments
Labels
  • extensibility: Issues related to expand or customize current configuration
  • feature-request: Request a new feature
  • functions: Issues tied to the functions category
  • p3

Comments

@shishkin

Before opening, please confirm:

  • I have installed the latest version of the Amplify CLI (see above), and confirmed that the issue still persists.
  • I have searched for duplicate or closed issues.
  • I have read the guide for submitting bug reports.
  • I have done my best to include a minimal, self-contained set of instructions for consistently reproducing the issue.

How did you install the Amplify CLI?

No response

If applicable, what version of Node.js are you using?

No response

Amplify CLI Version

7.4.5

What operating system are you using?

Mac

Amplify Categories

Not applicable

Amplify Commands

push

Describe the bug

Unable to deploy a Lambda function via a custom CDK stack through Amplify. The same example works in a standalone CDK project. The error seems to imply that code assets are not built; I also don't see my function code zipped anywhere, as CDK usually does.

Expected behavior

Lambda function deployed.

Reproduction steps

  1. Add a custom CDK category
  2. Define a Lambda function in the CDK stack (assumes import * as lambda from "@aws-cdk/aws-lambda" and a cdkPath variable pointing at the custom resource directory):
    const fn = new lambda.Function(this, "fn", {
      runtime: lambda.Runtime.NODEJS_14_X,
      code: lambda.Code.fromAsset(`${cdkPath}/functions`),
      handler: "hello",
    });
  3. amplify push

GraphQL schema(s)

# Put schemas below this line

Log output

# Put your logs below this line

UPDATE_FAILED               customcdk                    AWS::CloudFormation::Stack Tue Nov 23 2021 19:39:22 GMT+0100 (Central European Standard Time) Parameters: [AssetParametersc2d8a94782daaaf9412bb34f1a08f5e68ab8d9b98a7073fe0aac5317505259dcS3Bucket59FE5BCE, AssetParametersc2d8a94782daaaf9412bb34f1a08f5e68ab8d9b98a7073fe0aac5317505259dcS3VersionKeyC246F305, AssetParametersc2d8a94782daaaf9412bb34f1a08f5e68ab8d9b98a7073fe0aac5317505259dcArtifactHash2AAFE25B] must have values

Additional information

No response

@kaustavghosh06
Contributor

@shishkin Yes, the Amplify CLI does not package and upload custom Lambda function source code for you. You can use amplify add function to add Lambdas; there, the CLI takes care of the packaging for you.

kaustavghosh06 changed the title from "Unable to deploy lambda function with custom CDK category" to "Support deploying lambda functions with custom CDK category" on Nov 23, 2021
@shishkin
Author

Thanks for the clarification, @kaustavghosh06. Maybe I don't understand how Amplify integrates CDK, but this seems unfortunate since CDK can package assets. Is there a way to use custom build steps or hooks to get CDK's behavior integrated into Amplify?

@kaustavghosh06
Contributor

kaustavghosh06 commented Nov 23, 2021

Yeah, at this moment we don't support packaging assets out of the box. But you can try using a pre-push hook (https://docs.amplify.aws/cli/project/command-hooks/) to package the Lambda assets; I haven't personally tried it, though.
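
A minimal skeleton of such a hook, as a sketch only (Amplify passes the hook context as JSON on stdin; the actual packaging and upload logic is omitted here):

    // amplify/hooks/pre-push.js
    const fs = require('fs');

    // The CLI pipes a JSON payload describing the command and environment to stdin
    const parameters = JSON.parse(fs.readFileSync(0, { encoding: 'utf8' }));
    const { envName } = parameters.data.amplify.environment;

    console.log(`pre-push hook running for Amplify env: ${envName}`);

    // TODO: zip the Lambda source and upload it to a bucket/key that the
    // custom CDK stack expects (a full example appears later in this thread).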

Also, is there any specific reason you don't want to lean on amplify add function to manage your Lambda functions?

@shishkin
Author

Because I need custom resources provisioned and my Lambda configured with env vars pointing to them. I couldn't find a way to get ARN outputs from CDK over to an Amplify Lambda.

@shishkin
Author

Another issue I ran into with Amplify functions is that the S3 triggers from Amplify Storage set up object key filters that I can't figure out how to configure.

@dave-moser

I have run into this issue as well when trying to use Amplify+CDK to deploy Lambda functions from Docker images. The CDK performs the Docker build to create the image on the host machine and uploads it to ECR. This does not work with Amplify+CDK. I use this CDK approach because I am deploying Lambda functions in Swift, which Amplify does not support via amplify add function.

Is asset packaging something that is on the roadmap?

lazpavel added the feature-request (Request a new feature) and functions (Issues tied to the functions category) labels on Nov 29, 2021
@andre347

andre347 commented Nov 30, 2021

Found this thread after spending an afternoon trying to package up my custom Lambda created in CDK. According to this thread, that doesn't work. However, if we use amplify add function and try to reference it within an SNS topic (via new LambdaSubscription) created in CDK, I get TypeScript errors. I've referenced the function via cdk.Fn.ref(dependencies.function.<functionName>.arn) but can't use it in the SNS subscription because that just returns a string (the ARN). How can I reference this function in my CDK?

Edit: I actually solved this. You can reference a lambda.Function from an ARN; I used the ARN I got from cdk.Fn.ref.
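
A minimal sketch of that pattern, assuming the CDK v1 modules used elsewhere in this thread (the resource name myAmplifyFunction and the topic are placeholders):

    import * as cdk from '@aws-cdk/core';
    import * as lambda from '@aws-cdk/aws-lambda';
    import * as sns from '@aws-cdk/aws-sns';
    import * as subs from '@aws-cdk/aws-sns-subscriptions';
    import * as AmplifyHelpers from '@aws-amplify/cli-extensibility-helper';
    import { AmplifyDependentResourcesAttributes } from '../../types/amplify-dependent-resources-ref';

    // Inside the custom stack's constructor: declare the dependency on the
    // Amplify-managed function so its ARN becomes available as a parameter
    const deps: AmplifyDependentResourcesAttributes = AmplifyHelpers.addResourceDependency(
      this,
      amplifyResourceProps?.category ?? 'custom',
      amplifyResourceProps?.resourceName ?? 'customResource',
      [{ category: 'function', resourceName: 'myAmplifyFunction' }]
    );

    // cdk.Fn.ref(...) only yields the ARN string; wrap it in an IFunction
    const importedFn = lambda.Function.fromFunctionArn(
      this,
      'ImportedAmplifyFn',
      cdk.Fn.ref(deps.function.myAmplifyFunction.Arn)
    );

    // The imported function can now be used wherever an IFunction is expected.
    // Note: CDK cannot add the SNS invoke permission to a function it only
    // imported by ARN, so that permission may need to be granted separately.
    const topic = new sns.Topic(this, 'MyTopic');
    topic.addSubscription(new subs.LambdaSubscription(importedFn));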

@arturlr

arturlr commented Dec 7, 2021

I have been using Amplify extensively, and before the custom CDK capability existed, all my backend was typically done using AWS SAM. I recently started migrating some of my CloudFormation backend, which contains IoT resources, queues, layers, and Lambdas, to CDK. The reason is that when you have multiple Lambda functions, it is much more convenient to use CDK than amplify add function. It would be tremendously beneficial to fully support CDK deployments via the Amplify custom CDK category, as it would let Amplify control the entire frontend/backend deployment.

@oste

oste commented Dec 17, 2021

I would also like to voice my support for using code: lambda.Code.fromAsset(... in custom CDK resources. I have a similar use case to @shishkin's for adding additional S3 trigger functions and configuring prefixes; however, being able to use local function assets seems generally useful.

@shishkin
Author

I haven't tried it with Amplify myself, but this CDK module might do asset bundling as part of resource synthesis and thus should not rely on the CDK CLI for bundling: https://docs.aws.amazon.com/cdk/api/latest/docs/aws-lambda-nodejs-readme.html. There is also this community construct for esbuild: https://github.com/mrgrain/cdk-esbuild.
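
A minimal sketch of that approach, assuming the @aws-cdk/aws-lambda-nodejs module (the entry path and handler name below are placeholders):

    import * as lambda from '@aws-cdk/aws-lambda';
    import { NodejsFunction } from '@aws-cdk/aws-lambda-nodejs';

    // NodejsFunction bundles the handler with esbuild at synth time rather
    // than expecting a pre-built asset directory.
    const fn = new NodejsFunction(this, 'fn', {
      entry: `${cdkPath}/functions/hello.ts`, // placeholder path to the handler source
      handler: 'handler',
      runtime: lambda.Runtime.NODEJS_14_X,
    });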

I just switched to using plain CDK instead of the Amplify CLI, as I found it's more productive in the long run to learn what Amplify is doing under the hood and replicate it with CDK than to waste day after day troubleshooting Amplify's confusing error messages and working around its idiosyncrasies. Some resources I've found for that: https://serverless-nextjs.com/docs/cdkconstruct/ to replicate Hosting and https://github.com/bobbyhadz/cdk-identity-pool-example to replicate Auth. The remaining building blocks are just plain DynamoDB and S3 constructs from CDK.

@oste

oste commented Dec 17, 2021

I plan to use more and more CDK as well, but figured I could still just use it within Amplify. If I continue to run into hurdles like this, though, completely breaking free makes sense. Thanks for the suggestion, I'll try it out.

@oste

oste commented Dec 18, 2021

I can report that @aws-cdk/aws-lambda-nodejs has the same "Parameters must have values" issue, unfortunately.

@oste

oste commented Dec 20, 2021

I raised this issue (aws/aws-cdk#18090) with aws-cdk, since the promising-looking addEventNotification function didn't seem to work with imported resources.

@DylanBruzenak

DylanBruzenak commented Jan 19, 2022

We need this as well, or at least the ability to output variables from custom CDK code to the rest of Amplify. Our use case is that we have a complex video encoding pipeline set up in the CDK, and we need to be able to bundle customized Lambda functions to work with it. We need role and bucket ARNs that are created in the CDK, and we can't get them over to the Amplify functions.

@nathanagez

@kaustavghosh06 as @DylanBruzenak said:

or at least the ability to output variables from custom CDK code to the rest of amplify.

We also need this ability.

@chakch

chakch commented Feb 24, 2022

Any update on this issue?

InnovateWithEric added the p3 and extensibility (Issues related to expand or customize current configuration) labels on Mar 3, 2022
@Punith13

Punith13 commented Apr 7, 2022

I tried to create an AwsCustomResource like this (sesPolicy, an IAM policy statement allowing the SES calls, is defined elsewhere):

    import { AwsCustomResource, AwsCustomResourcePolicy } from '@aws-cdk/custom-resources';

    const ConfigurationSetName = 'defaultConfigSet';
    const configSet = new AwsCustomResource(this, ConfigurationSetName, {
      onUpdate: {
        service: 'SESV2',
        action: 'createConfigurationSet',
        parameters: {
          ConfigurationSetName,
          SendingOptions: { SendingEnabled: true },
        },
        physicalResourceId: {},
      },
      onDelete: {
        service: 'SESV2',
        action: 'deleteConfigurationSet',
        parameters: {
          ConfigurationSetName,
        },
      },
      policy: AwsCustomResourcePolicy.fromStatements([sesPolicy]),
      logRetention: 7,
    });

CDK would normally create the Lambda function automatically and upload the code to an S3 bucket. Checking the generated CloudFormation template, I can only see the reference to the S3 bucket, but no creation or upload step, and I get this error on amplify push:

  • Error occurred while GetObject. S3 Error Code: NoSuchBucket. S3 Error Message: The specified bucket does not exist (Service: Lambda, Status Code: 400

@ezalorsara

Any update?

@andreav

andreav commented Oct 21, 2022

+1

@mmoulton

mmoulton commented Dec 2, 2022

@ykethan With the recent announcement of Amplify CLI Beta supporting CDK v2, does that come with proper support for deploying the full Cloud Assembly, including the assets?

@renschler

renschler commented Jan 9, 2023

It would be awesome if the documentation here: https://docs.amplify.aws/cli/custom/cdk/ could reflect the limitations described in this issue.

@csmcallister

It would be awesome if the documentation here: https://docs.amplify.aws/cli/custom/cdk/ could reflect the limitations described in this issue.

Ditto. It's a disappointing waste of time to assume you can use the CDK as usual with a custom resource, only to run into this issue.

@asmajlovicmars

I ran into the same problem with code: lambda.Code.fromAsset(.... It's kind of a pain because we're using Amplify for some things, but when we want to add some extra policies or memory to our functions, we have to do it manually in CloudFormation. This means more custom code in different places, and that's not really what we're looking for.

We've got two big projects that started out with Amplify, but then we added some extra stuff with CDK (like EventBridge and Step Functions as workflows). I thought it would be awesome to combine everything into one big repo with Amplify, CDK, and our frontend (which is SvelteKit, obviously!), but it's not quite doable yet.

Amplify is great for getting Auth, Storage and AppSync with DynamoDB up and running quickly, but we end up doing a lot of custom coding after that. If CDK and Amplify played nicely together, it would be an amazing tool for rapid development.

@renschler

I thought it would be awesome to combine everything into one big repo with Amplify, CDK, and our frontend (which is SvelteKit, obviously!), but it's not quite doable yet.

Amplify is great for getting Auth, Storage and AppSync with DynamoDB up and running quickly, but we end up doing a lot of custom coding after that. If CDK and Amplify played nicely together, it would be an amazing tool for rapid development.

Precisely! I ended up ditching Amplify for everything except Auth, Hosting, & Appsync.

Everything else I'm deploying w/ CDK in a separate repo.

@DylanBruzenak

We're doing everything with pure CDK these days. Works pretty well once you get everything rolling.

josefaidt self-assigned this on Mar 3, 2023
@flochaz

flochaz commented Mar 16, 2023

Yeah, at this moment we don't support packaging assets out of the box. But you can try using a pre-push hook (https://docs.amplify.aws/cli/project/command-hooks/) to package the Lambda assets; I haven't personally tried it, though.

Also, is there any specific reason you don't want to lean on amplify add function to manage your Lambda functions?

Really sad to have to go that way, but here is a working hook-based solution for a Flink application (the same principle applies for a Lambda or any other asset):

amplify/hooks/pre-push.js

const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk');
const zip = require('adm-zip');
const crypto = require('crypto');

try {
  const parameters = JSON.parse(fs.readFileSync(0, { encoding: 'utf8' }));
  // console.log('Parameters: ', JSON.stringify(parameters));

  // Retrieve amplify env
  const { envName } = parameters.data.amplify.environment;
  // console.log('Amplify envName: ', envName);

  // Retrieve the S3 bucket name from the amplify/team-provider-info.json
  const teamProviderInfo = JSON.parse(
    fs.readFileSync(path.join(__dirname, '../team-provider-info.json'), {
      encoding: 'utf8',
    })
  );
  // console.log('teamProviderInfo: ', JSON.stringify(teamProviderInfo));
  const s3BucketName =
    teamProviderInfo[envName].awscloudformation.DeploymentBucketName;

  // Load profile used by amplify
  const localInfo = JSON.parse(
    fs.readFileSync(path.join(__dirname, '../.config/local-aws-info.json'), {
      encoding: 'utf8',
    })
  );

  const profile = localInfo[envName].profileName;

  // console.log('Profile: ', profile);

  // TODO: Add envName to the zip file name
  // Zip content of amplify/backend/custom/measurementAggregator/flink as flink-{hash}.zip
  const flinkCodePath = path.join(
    __dirname,
    '../backend/custom/measurementAggregator/flink'
  );

  // Calculate hash of flink/tumbling-windows.py file
  const hash = crypto.createHash('sha256');
  hash.update(fs.readFileSync(path.join(flinkCodePath, 'tumbling-windows.py')));
  const hashValue = hash.digest('hex');

  const zipDestinationPath = path.join(__dirname, `flink-${hashValue}.zip`);

  // Check if zip file already exists. if it does that means content has not changed and we can skip the upload
  if (!fs.existsSync(zipDestinationPath)) {
    // eslint-disable-next-line new-cap
    const zipFile = new zip();
    zipFile.addLocalFolder(flinkCodePath);
    zipFile.writeZip(zipDestinationPath);

    // Upload zipDestinationPath to the S3 bucket
    const credentials = new AWS.SharedIniFileCredentials({ profile });
    AWS.config.credentials = credentials;
    const s3 = new AWS.S3();

    const uploadParams = {
      Bucket: s3BucketName,
      Key: zipDestinationPath.split('/').pop(),
      Body: fs.readFileSync(zipDestinationPath),
    };

    s3.upload(uploadParams, (err, data) => {
      if (err) {
        console.error('Error', err);
        throw err;
      }
      if (data) {
        console.log('Upload Success', data.Location);
        process.exit(0);
      }
    });
  } else {
    console.log('No changes detected. Skipping upload');
    process.exit(0);
  }
} catch (error) {
  console.log(error);
  process.exit(1);
}

amplify/backend/custom/measurementAggregator/cdk-stack.ts

import * as cdk from '@aws-cdk/core';
import * as AmplifyHelpers from '@aws-amplify/cli-extensibility-helper';
import { AmplifyDependentResourcesAttributes } from '../../types/amplify-dependent-resources-ref';
import * as iam from '@aws-cdk/aws-iam';
import * as lambda from '@aws-cdk/aws-lambda';
import * as kinesis from '@aws-cdk/aws-kinesis';
import * as flink from '@aws-cdk/aws-kinesisanalytics-flink';
import { KinesisEventSource } from '@aws-cdk/aws-lambda-event-sources';
import * as s3 from '@aws-cdk/aws-s3';
import * as crypto from 'crypto';
import * as fs from 'fs';
import * as path from 'path';

export class cdkStack extends cdk.Stack {
  constructor(
    scope: cdk.Construct,
    id: string,
    props?: cdk.StackProps,
    amplifyResourceProps?: AmplifyHelpers.AmplifyResourceProps
  ) {
    super(scope, id, props);
    /* Do not remove - Amplify CLI automatically injects the current deployment environment in this input parameter */
    new cdk.CfnParameter(this, 'env', {
      type: 'String',
      description: 'Current Amplify CLI env name',
    });

    const beforeAgregate = new kinesis.Stream(this, 'BeforeAgregate', {
      streamName: 'AldoBeforeAggregate',
    });

    // Calculate hash of flink/tumbling-windows.py file
    const hash = crypto.createHash('sha256');
    hash.update(fs.readFileSync(path.join(__dirname, '../flink/tumbling-windows.py')));
    const hashValue = hash.digest('hex');
    const fileKey = `flink-${hashValue}.zip`;

    // Get the deployment bucket name from the amplify meta file
    const amplifyProjectInfo = AmplifyHelpers.getProjectInfo();
    console.log('amplifyProjectInfo', JSON.stringify(amplifyProjectInfo));
    const envName = amplifyProjectInfo.envName;
    const teamProviderInfo = JSON.parse(
      fs.readFileSync(path.join(__dirname, '../../../../team-provider-info.json'), {
        encoding: 'utf8',
      })
    );
    const s3BucketName = teamProviderInfo[envName].awscloudformation.DeploymentBucketName;
    console.log('s3BucketName', s3BucketName);
    
    const bucket = s3.Bucket.fromBucketName(this, 'FlinkAppCodeBucket', s3BucketName);

    const afterAgreagte = new kinesis.Stream(this, 'AfterAgreagte', {
      streamName: 'AldoAfterAggregate',
    });

    const propertyGroups: flink.PropertyGroups = {
      'consumer.config.0': {
        'aws.region': cdk.Aws.REGION,
        'input.stream.name': beforeAgregate.streamName,
        'scan.stream.initpos': 'LATEST',
      },
      'kinesis.analytics.flink.run.options': {
        jarfile: 'flink-sql-connector-kinesis-1.15.2.jar',
        python: 'tumbling-windows.py',
      },
      'producer.config.0': {
        'aws.region': cdk.Aws.REGION,
        'output.stream.name': afterAgreagte.streamName,
        'shard.count': '4',
      },
    };

    const agregateStreams = new flink.Application(this, 'App', {
      code: flink.ApplicationCode.fromBucket(bucket, fileKey),
      runtime: flink.Runtime.of('FLINK-1_15'),
      propertyGroups: propertyGroups,
      role: new iam.Role(this, 'Role', {
        assumedBy: new iam.ServicePrincipal('kinesisanalytics.amazonaws.com'),
        inlinePolicies: {
          FlinkPolicy: new iam.PolicyDocument({
            statements: [
              new iam.PolicyStatement({
                actions: ['s3:GetObject*', 's3:GetBucket*', 's3:List*'],
                resources: [bucket.bucketArn, bucket.bucketArn + '/*'],
              }),
            ],
          }),
        },
      }),
    });

    beforeAgregate.grantRead(agregateStreams);
    afterAgreagte.grantWrite(agregateStreams);

    // Load the lambda function that will produce and consume the streams
    // TODO: fix ... not working at all
    const retVal: AmplifyDependentResourcesAttributes = AmplifyHelpers.addResourceDependency(
      this,
      amplifyResourceProps?.category ?? 'custom',
      amplifyResourceProps?.resourceName ?? 'SimulationEngine',
      [
        {
          category: 'function',
          resourceName: 'SmartDashShelfMeasurementIngestion',
        },
      ]
    );

    const consumerProducerFunction = lambda.Function.fromFunctionArn(
      this,
      'SmartDashShelfMeasurementIngestion',
      cdk.Fn.ref(retVal.function.SmartDashShelfMeasurementIngestion.Arn)
    );

    beforeAgregate.grantWrite(consumerProducerFunction);
    afterAgreagte.grantRead(consumerProducerFunction);

    // Trigger consumerProducerFunction when a new record is added to the afterAgregate stream
    consumerProducerFunction.addEventSource(
      new KinesisEventSource(afterAgreagte, {
        startingPosition: lambda.StartingPosition.LATEST,
      })
    );
  }
}

Long story short:

  • In the pre-push hook: calculate a hash (to avoid re-uploading and updating the stack when nothing changed), zip the asset, and upload it to the deployment bucket.
  • In CDK: calculate the same hash and use it as the key for your fromBucket asset. The key point is to make sure the same file name/S3 key is used in both the pre-push hook and the CDK stack.

There is some interesting logic in there to get the profile and the deployment bucket ;)

@joekiller
Contributor

joekiller commented May 26, 2023

Noting here that the reason Amplify custom resource stacks don't work well beyond L1 resources is that the custom-resource build step just runs tsc, versus whatever the CDK CLI does; the CDK likely reconciles roles and dependencies after compilation, or something along those lines. Not to mention the CDK has a "bootstrap" bucket to push assets to, not unlike Amplify's deployment bucket. I feel like they could be blended.

@joekiller
Contributor

joekiller commented May 30, 2023

I've been kicking around a way to make CDK and Amplify more Reese's cup and less salad dressing. Using a combination of amplify init, the Amplify export-to-CDK feature, and an enhanced CDK construct, I'm able to have a reasonable workflow that utilizes the best of the Amplify CLI and CDK extensibility.

The main change is to always amplify export and never amplify push. Just npm run deploy, which runs the following:

amplify export --out cdk/lib --yes && cd cdk && npm install && cdk deploy --require-approval never exported-amplify-backend-stack && cd -

Example repo here: https://github.com/joekiller/amplify-scratch
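
On the CDK side, the exported backend gets wrapped in a CDK app along these lines; this is a sketch only, based on the @aws-amplify/cdk-exported-backend package from the Amplify export-to-CDK docs (the environment name and export directory are placeholders, and the exact wiring lives in the repo above):

    import * as cdk from 'aws-cdk-lib';
    import * as path from 'path';
    import { AmplifyExportedBackend } from '@aws-amplify/cdk-exported-backend';

    const app = new cdk.App();

    // Pull in everything that `amplify export --out cdk/lib` generated
    new AmplifyExportedBackend(app, 'AmplifyExportedBackend', {
      amplifyEnvironment: 'dev',                              // placeholder Amplify env name
      path: path.resolve(__dirname, 'amplify-export-myapp'),  // placeholder export directory
    });

    app.synth();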

@mateuszboryn

Is there any update on fixing that issue?

I would like to deploy static files to an S3 bucket with CDK's
https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_s3_assets-readme.html
or
https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_s3_deployment-readme.html
but it doesn't work with Amplify: nothing gets uploaded to the CDK asset buckets.

@mateuszboryn

This ticket has been open for 2 years now. Many people are complaining. Could you please increase the priority of this issue?
I don't think Firebase has hidden issues like this.

@electronicalias

Any update? I'd like to be able to use cloudfront.experimental.EdgeFunction(), and add function in the CLI doesn't support that.

The reason is that I want to intercept requests to CloudFront and extract parameters from the URL string; this is needed for AWS Marketplace integration.

@naedx
Contributor

naedx commented Jul 3, 2024

I also have the same use case on Gen2.
