
chore: Updating versions of protobuf (all languages) and cpp dependent libraries #6069

Draft · wants to merge 8 commits into base: main
Conversation

jcferretti (Member)

No description provided.

@jcferretti jcferretti self-assigned this Sep 15, 2024
@jcferretti jcferretti added this to the 0.37.0 milestone Sep 15, 2024
@jcferretti jcferretti changed the title Updating versions of protobuf (all languages) and cpp dependent libraries chore: Updating versions of protobuf (all languages) and cpp dependent libraries Sep 15, 2024
@deephaven deephaven deleted a comment from github-actions bot Sep 15, 2024
@jcferretti jcferretti marked this pull request as draft September 17, 2024 18:52
@jcferretti jcferretti assigned niloc132 and unassigned jcferretti Sep 17, 2024
@jcferretti (Member Author)

======================================================================
ERROR [0.000s]: test_protobuf_spec (tests.test_kafka_consumer.KafkaConsumerTestCase) [regular]
Check a Protobuf Kafka subscription creates the right table.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/python/deephaven/stream/kafka/consumer.py", line 258, in _consume
    j_table=_JKafkaTools.consumeToTable(
RuntimeError: java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    io/confluent/kafka/schemaregistry/protobuf/ProtobufSchema.toMessage(Lcom/google/protobuf/DescriptorProtos$FileDescriptorProto;Lcom/google/protobuf/DescriptorProtos$DescriptorProto;)Lcom/squareup/wire/schema/internal/parser/MessageElement; @679: invokestatic
  Reason:
    Type 'com/google/protobuf/DescriptorProtos$MessageOptions' (current frame, stack[1]) is not assignable to 'com/google/protobuf/GeneratedMessageV3$ExtendableMessage'
  Current Frame:
    bci: @679
    flags: { }
    locals: { 'com/google/protobuf/DescriptorProtos$FileDescriptorProto', 'com/google/protobuf/DescriptorProtos$DescriptorProto', 'java/lang/String', 'com/google/common/collect/ImmutableList$Builder', 'com/google/common/collect/ImmutableList$Builder', 'com/google/common/collect/ImmutableList$Builder', 'com/google/common/collect/ImmutableList$Builder', 'java/util/LinkedHashMap', 'java/util/List', 'com/google/common/collect/ImmutableList$Builder' }
    stack: { 'com/google/common/collect/ImmutableList$Builder', 'com/google/protobuf/DescriptorProtos$MessageOptions' }
  Bytecode:
[removed]

	at io.confluent.kafka.schemaregistry.protobuf.ProtobufSchemaProvider.parseSchemaOrElseThrow(ProtobufSchemaProvider.java:38)
	at io.confluent.kafka.schemaregistry.SchemaProvider.parseSchema(SchemaProvider.java:75)
	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.parseSchema(CachedSchemaRegistryClient.java:301)
	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:347)
	at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaBySubjectAndId(CachedSchemaRegistryClient.java:472)
	at io.deephaven.kafka.ProtobufImpl.descriptor(ProtobufImpl.java:231)
	at io.deephaven.kafka.ProtobufImpl$ProtobufConsumeImpl.setDescriptor(ProtobufImpl.java:118)
	at io.deephaven.kafka.ProtobufImpl$ProtobufConsumeImpl.getDeserializer(ProtobufImpl.java:102)
	at io.deephaven.kafka.KafkaTools.getConsumeStruct(KafkaTools.java:1257)
	at io.deephaven.kafka.KafkaTools.consume(KafkaTools.java:1347)
	at io.deephaven.kafka.KafkaTools.consumeToTable(KafkaTools.java:1020)


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/python/tests/test_kafka_consumer.py", line 163, in test_protobuf_spec
    t = consume(
  File "/python/tests/test_kafka_consumer.py", line 151, in consume
    return ck.consume(
  File "/python/deephaven/stream/kafka/consumer.py", line 172, in consume
    return _consume(kafka_config, topic, partitions, offsets, key_spec, value_spec, table_type, to_partitioned=False)
  File "/python/deephaven/stream/kafka/consumer.py", line 279, in _consume
    raise DHError(e, "failed to consume a Kafka stream.") from e
deephaven.dherror.DHError: failed to consume a Kafka stream. : RuntimeError: java.lang.VerifyError: Bad type on operand stack
Traceback (most recent call last):
  File "/python/deephaven/stream/kafka/consumer.py", line 258, in _consume
    j_table=_JKafkaTools.consumeToTable(
RuntimeError: java.lang.VerifyError: Bad type on operand stack

https://github.com/deephaven/deephaven-core/actions/runs/10896144753/job/30235470489?pr=6069#step:8:11127
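For context on the failure above: a `java.lang.VerifyError` of the form "Type X is not assignable to Y" is the typical signature of a binary incompatibility between the protobuf-java version a library was compiled against and the one present at runtime. Here, the schema-registry bytecode passes a `DescriptorProtos$MessageOptions` to a method declared to take `GeneratedMessageV3$ExtendableMessage`, a superclass relationship that no longer holds in the newer protobuf-java runtime, so the JVM's bytecode verifier rejects the method at link time. A minimal sketch of the assignability check the verifier performs, using hypothetical stand-in classes (`ExtendableMessageStandIn`, `OldMessageOptions`, `NewMessageOptions` are illustrative names, not real protobuf types):

```java
// Stand-ins for the two class hierarchies involved (hypothetical names).
class ExtendableMessageStandIn {}
// Layout the caller was compiled against: options extend ExtendableMessage.
class OldMessageOptions extends ExtendableMessageStandIn {}
// Layout present at runtime: options extend an unrelated base class.
class NewBase {}
class NewMessageOptions extends NewBase {}

public class VerifyErrorSketch {
    public static void main(String[] args) {
        // The verifier checks that the value on the operand stack is
        // assignable to the declared parameter type of the invoked method.
        boolean oldOk = ExtendableMessageStandIn.class.isAssignableFrom(OldMessageOptions.class);
        boolean newOk = ExtendableMessageStandIn.class.isAssignableFrom(NewMessageOptions.class);
        System.out.println("old layout assignable: " + oldOk);
        System.out.println("new layout assignable: " + newOk);
    }
}
```

When the check fails for bytecode that was compiled under the old layout, the result is exactly the `VerifyError: Bad type on operand stack` seen above, rather than a compile error or a `ClassNotFoundException`.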

The remaining failure is related to:

confluentinc/schema-registry#3047
