SNOW-755757: Unwanted INFO messages from arrow memory allocation manager #1095
Comments
@hchenatsafe can you give us a bit more detail on how you're setting up your logging? I just ran a test with 3.13.20 and configured log4j with the net.snowflake.client package set to INFO, and I don't see any messages at all, including the ones you reported.
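For illustration, a log4j2 setup along those lines might look like the following. This is a minimal sketch only; the actual configuration used in the test above was not shared, so the appender and logger names here are placeholders.

```properties
# Illustrative log4j2.properties; the concrete file used for the test above
# was not shared, so these names and patterns are placeholders.
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{ISO8601} %-5p [%c] %m%n

# Snowflake JDBC driver logging at INFO
logger.snowflake.name = net.snowflake.client
logger.snowflake.level = INFO

rootLogger.level = WARN
rootLogger.appenderRef.stdout.ref = STDOUT
```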
Closing this since we're unable to reproduce the issue.
I don't have an isolated repro to share, but these logs seem to have been introduced in v3.13.18. They are sent to stderr.
@hchenatsafe The log messages were introduced by a change in the Apache Arrow version, but what confused me about the problem described here is how logging is configured in your application when this gets printed. Are you saying that you don't have debugging enabled for the JDBC driver and this is printed directly to standard error?
I'm not clear on how our logging is currently configured, but this is the first time I've seen log output show up somewhere unexpected. As far as I know, debugging is not enabled for the driver, and the logging goes directly to stderr.
Reporting that a similar issue is observed on my side when accessing Snowflake via JDBC on an M1 machine. The exception stack and trace indicate the exception is thrown when the JDBC driver tries to box the response from Snowflake into Arrow format. More details are below.
Std err:
Trace:
Hi @zhongzheng-instacart, this issue is mainly about the INFO messages you pointed out in the trace log. Those are printed by the Apache Arrow memory library code and were introduced starting with Arrow 7.0, the Apache Arrow version the JDBC driver upgraded to as of 3.13.18. You could suppress those messages using the JDK logger configuration: pass a logging.properties file to the JVM and add an entry in it for the Arrow memory package; if you don't pass a logging.properties file, the default configuration is taken from the logging.properties file under the JDK installation. Theoretically, you should be able to adjust the logging level for the apache.arrow.memory package using log4j as well, but that doesn't seem to be working as expected. From my perspective, I believe that's the real issue that should be addressed.
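A minimal sketch of that suggestion, assuming the messages are emitted through java.util.logging; the exact entry from the comment above is not reproduced here, so the logger name and level are illustrative:

```properties
# Illustrative logging.properties entry; assumes the Arrow memory messages
# go through java.util.logging and that the logger name matches the
# Arrow memory package.
org.apache.arrow.memory.level = WARNING
```

The file is picked up when the JVM is started with `-Djava.util.logging.config.file=/path/to/logging.properties`. If the driver bundles Arrow under a relocated package name, the logger name would need to be adjusted accordingly.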
Closing this issue as it is no longer reproducible with the latest JDBC driver. Related change: https://github.com/snowflakedb/snowflake-jdbc/pull/1202/files#diff-e67a1908492d15b4743e498e89ff01f892a9dd035b66a2afc5820db93039885f
Hello! I think I have the same issue with the most recent
Even though I'm using SLF4J with this JVM arg:
which is working properly because I have this logger config in
Also, I noticed that the Arrow messages are printed to stderr, whereas the other messages go to stdout.
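For completeness, a programmatic equivalent of the JDK logger approach mentioned earlier. This is a sketch only, assuming the Arrow messages are emitted through java.util.logging; the logger name is taken from the package discussed above and is not confirmed for this particular setup:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public final class ArrowMemoryLogSilencer {
    // Hold a strong reference: java.util.logging only keeps weak references
    // to loggers, so a locally obtained logger could be garbage-collected
    // and lose the level set on it.
    private static final Logger ARROW_MEMORY_LOGGER =
            Logger.getLogger("org.apache.arrow.memory");

    private ArrowMemoryLogSilencer() {}

    public static void silence() {
        // Raise the threshold so INFO messages from the Arrow memory
        // allocation manager no longer reach the console handler (stderr).
        ARROW_MEMORY_LOGGER.setLevel(Level.WARNING);
    }
}
```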
Hi there, could you please open a new issue with the details if this still happens for you? Using one that has been closed for 1.5 years now doesn't get the issue the attention it needs. Thank you in advance!
You're right! Here it is: #1956
After updating the Snowflake JDBC driver from 3.13.14 to 3.13.20, I am getting the following INFO messages from Arrow: