
CameraIF Example Project on 78002 #981

Open · smtalds opened this issue Apr 4, 2024 · 33 comments

smtalds commented Apr 4, 2024

Hi, I am working on the MAX78002 EVKit. I ran the CameraIF example with the OV7692 camera and it worked. Then I wanted to get input from the GPIO ports, and afterwards I reconnected the camera to the DVP port. I2C reads the camera ID and the TFT initializes, but camera_is_image_rcv never returns true. I checked with an oscilloscope and I do get signals: XVCLK and PCLK are 7.5 MHz, and I can see HSYNC and VSYNC. I don't know what the problem is now. I followed the example project configuration for the 78002. Maybe I changed some drivers, so I uninstalled the MAXIM SDK and installed it again, but it's still the same.

Thanks for answering.

Jake-Carter (Contributor) commented

Hi @smtalds, can you confirm that you have made all of the required connections from the README?

Also, be aware of the critical power-sequencing requirements for the board documented in the quick-start guide. Back-powering can potentially damage the camera modules.

There is also a known issue with the grey ribbon cable for the DVP connector: the pins can easily be stripped out when you remove it. See the note on step 4 of the OVM7692 section in the quick-start guide here. It's recommended to essentially never remove the cable once it's been inserted, to avoid damage.

You can try the ImgCapture project as well.

smtalds commented Apr 5, 2024

Hi @Jake-Carter, yes, I confirmed the required connections. If the camera or DVP connector were damaged, I would not see signals on the GPIO ports JH9 and JH8. Am I right?

I will try ImgCapture. Thank you for answering.

smtalds commented Apr 5, 2024

@Jake-Carter Hi again, I tried ImgCapture and I think I still have the same problem.

[screenshot]

It gets stuck here again. Can I use PCAM? How can I confirm whether the camera or the DVP cable is damaged?
Update: I checked the DVP cable for short circuits with a multimeter and it's OK. Can I reset the camera / the camera settings?

smtalds commented Apr 15, 2024

@Jake-Carter Do you have any idea? I think the camera is not generating interrupts. How can I hardware-reset the camera? Maybe I applied 3.3V to the camera's VREF pins, because I wanted to get data from the GPIO pins. I can't find the camera's datasheet.

Jake-Carter (Contributor) commented

@smtalds it's possible that the camera module has some damage from the back-powering / power-on sequencing issue linked above.

Though it is a good sign that the ID can at least be read out successfully.

The camera datasheet is under NDA; you can try contacting OmniVision to obtain a copy. Try reading the following registers. These are a few system registers that will show whether the camera is in sleep mode, whether the image resolution has been set successfully, etc. Here are the values from my camera after default initialization:

[screenshot of register values]

Commands:

get-reg 0x12
get-reg 0x13
get-reg 0x18
get-reg 0x1a

smtalds commented Apr 17, 2024

@Jake-Carter Hi again, I contacted OmniVision and read the registers from the camera. This problem started when I tried to get data from the GPIO ports JH6-9 on the EVKit; after that I can't get any data or interrupts with the SDK. Maybe this info will help in finding the solution.
[screenshot of register values]

Jake-Carter (Contributor) commented

Ok, it's a good sign that the I2C configuration bus is at least working. Can you send a picture of your setup?

smtalds commented Apr 22, 2024

[photo of setup]
I think I found the problem. I generated example camera data from an FPGA and sent it to the 78002, then changed the setup of the CameraIF example, and I got the interrupt. So I think the camera module was damaged by the back-powering (if there is nothing wrong with my setup).

Jake-Carter (Contributor) commented

Thanks @smtalds, I agree camera damage is likely. The modules are somewhat sensitive, and the back-powering issue is not obvious until you know about it.

Shoot me an email at jake.carter@analog.com and I will see if we have some internal stock of the camera modules we can ship you. We can also provide you with a revised PICO debugger that eliminates the back-powering issue.

smtalds commented Apr 26, 2024

@Jake-Carter I sent it. Are you going to close the issue? I may have other questions.

smtalds commented Apr 26, 2024

@Jake-Carter Hi again, we figured something out: DOVDD on the camera module is 1.8V, but in camera.c

MXC_GPIO_SetVSSEL(gpio_cfg_pcif_vsync.port, MXC_GPIO_VSSEL_VDDIOH,

the GPIO supply is set to MXC_GPIO_VSSEL_VDDIOH, i.e. 3.3V. When we changed this setting to VDDIO (1.8V), the data came through and we got a frame on the TFT screen. How is this possible? Is there a problem in the code?

Jake-Carter (Contributor) commented

Thanks @smtalds, good find. This is definitely a problem in the camera drivers; it has gone unnoticed for a long time.

I see that the absolute maximum rating for the OV7692 I/O pins is DOVDD + 1V (2.8V in this case). Perhaps there is some board-to-board variability in VDDIOH that makes the overvoltage more or less severe. It's strange that most of our boards have been working at VDDIOH for several years.

Regardless, we are exceeding the absolute maximum ratings of the sensor, which can cause undefined failures like this one. I will update our drivers to fix this.

I'm surprised this hasn't been seen until now. Thanks for reporting it and investigating.
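The numbers here are easy to sanity-check. Below is a standalone C sketch, not MSDK code: `pad_rail_ok` is a hypothetical helper, and the millivolt values come from this thread.

```c
#include <assert.h>

/* Millivolt values from the discussion above. */
#define DOVDD_MV  1800 /* sensor I/O supply (1.8 V) */
#define VDDIO_MV  1800 /* MAX78002 VDDIO rail */
#define VDDIOH_MV 3300 /* MAX78002 VDDIOH rail */

/* OV7692 abs-max for the I/O pins is DOVDD + 1 V, per the comment above. */
static int pad_rail_ok(int rail_mv, int dovdd_mv)
{
    return rail_mv <= dovdd_mv + 1000;
}
```

With these values, VDDIO (1.8V) is within the 2.8V limit while VDDIOH (3.3V) exceeds it by 0.5V, which matches the behaviour reported above.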

smtalds commented Apr 30, 2024

You are welcome. Should I close the issue? I may have other questions later.

Jake-Carter (Contributor) commented

I will tag this ticket to be closed by the PR, so we can keep it open until then.

smtalds commented May 9, 2024

@Jake-Carter Hello again Jake :). I want to ask about HSYNC and VSYNC. I understand the frame is initialized on the negative edge of VSYNC. If I then want to write a frame to the TFT from another camera or an FPGA, should I define HSYNC, the back porch, or the active area? Can I send the whole frame's data to the TFT while HSYNC is always high? Should I keep HSYNC high for 320x2x240 bits? I could not find a detailed camera datasheet or QVGA timing information.

Jake-Carter (Contributor) commented

Hello!

VSYNC is your "frame start"/"frame complete" signal from the camera. It's a short pulse signal from 0 -> 1.

HSYNC is set to 1 during a complete row of pixel data. It transitions back to 0 at the end of the row. So the edges of HSYNC define the boundaries of a row of pixels.

At the end of the entire image, there is another VSYNC pulse to signal the image is complete. This can also indicate the start of the next image depending on whether you're in continuous or single-shot capture mode.

This hardware-based signaling is the mode that our drivers operate in. See section 16.3.1 of the MAX78000 UG for some more details:

[screenshot of UG excerpt]

I think I might not fully understand your setup, but I hope the general guidelines are a useful starting point. The time between VSYNC signals defines your framerate, and your framerate defines the free bandwidth that you have to do other things in between each image. For example, 4fps -> 250ms between each frame, so you have 250ms to update the TFT with something else.

"Porching" is a camera setting that allows you to artificially extend the time between HSYNC and VSYNC signals by adding extra dummy "porch" bytes to the data. It's generally only useful for resolving lower-level timing issues with the camera interface.

Hope this helps, let me know if I can help clarify further or didn't understand correctly.
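As a quick illustration of the frame-budget arithmetic above, here is a standalone C sketch with no MSDK dependencies (`frame_budget_ms` is a hypothetical helper, not a driver function):

```c
#include <assert.h>

/* Milliseconds available between VSYNC pulses at a given framerate,
 * i.e. the time budget for TFT updates or other work per frame. */
static unsigned frame_budget_ms(unsigned fps)
{
    return 1000u / fps;
}
```

For example, 4 fps gives a 250 ms budget per frame, while 30 fps shrinks it to about 33 ms.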

smtalds commented May 10, 2024

I understand. I should send data on the positive edge of VSYNC while HSYNC is high; my first byte should be there. Am I right? Or should I send data after the negative edge of VSYNC?

[timing diagram]

If I send data at the negative edge of VSYNC, we can say the frame has started. So the first row is okay; for the second row, when HSYNC is set to 1, will it continue writing to the FIFO? For the first row (the start of the frame) we miss only one HSYNC clock cycle.

Jake-Carter (Contributor) commented

@smtalds You should send data after the negative edge of VSYNC. The "frame start" signal only completes when VSYNC goes back to 0. VSYNC should remain high for 1 PCLK cycle for the start of the frame to be detected.

The first data byte will be sampled on the first rising edge of PCLK after the negative edge of VSYNC.

[timing diagram]

When hsync is set to 1, it will continue writing to the fifo, right?

Yes, when HSYNC is 1 data will be sampled on each rising edge of PCLK and shifted into the FIFO.

HSYNC should transition from 1 -> 0 at the end of each row, then back from 0 -> 1 at the start of the next row. Unfortunately, our timing diagrams don't show this part.
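To make the sampling rule concrete, here is a minimal hardware-free C simulation of the behaviour described above (hypothetical types and names, not the actual PCIF driver): after VSYNC pulses high and returns low, one byte is captured on each rising edge of PCLK while HSYNC is high.

```c
#include <assert.h>
#include <stddef.h>

/* One sample point per PCLK rising edge. */
typedef struct {
    int vsync;          /* 1 during the frame-start pulse */
    int hsync;          /* 1 during active pixel data */
    unsigned char data; /* D[7:0] at this edge */
} pclk_edge_t;

/* Returns the number of bytes shifted into `fifo` (capacity `cap`).
 * Capture is armed only after VSYNC has pulsed high and returned low,
 * so data before the frame-start pulse is ignored. */
static size_t capture_frame(const pclk_edge_t *edges, size_t n,
                            unsigned char *fifo, size_t cap)
{
    size_t count = 0;
    int seen_vsync = 0, armed = 0;
    for (size_t i = 0; i < n; i++) {
        if (edges[i].vsync) {
            seen_vsync = 1;  /* VSYNC pulse in progress */
        } else if (seen_vsync) {
            armed = 1;       /* falling edge of VSYNC: frame has started */
        }
        if (armed && !edges[i].vsync && edges[i].hsync && count < cap) {
            fifo[count++] = edges[i].data; /* sample on this rising edge */
        }
    }
    return count;
}
```

Feeding in a one-cycle VSYNC pulse followed by two HSYNC-gated rows reproduces the behaviour in the diagrams: nothing is captured before the pulse, and each row's bytes are sampled only while HSYNC is high.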

smtalds commented May 14, 2024

Oh, I was sampling all the data on each falling edge, and I always missed one pixel (2 bytes) in each row.
[screenshot]
So I should start sampling data on the first rising edge after VSYNC goes low; there is a half-clock-cycle delay, I guess. Did I understand correctly?
[timing diagram]

Jake-Carter (Contributor) commented

@smtalds yes, correct

smtalds commented May 27, 2024

Okay, thank you, I think I get it. Now I want to ask about another problem. I want to generate the face_db for the FacialRecognition example on the MAX78002. When I run sh gen_db.sh, I get this error:
[screenshot of error]

I guess the file no longer exists at the Drive link. I am working in the ai8x-training env. Can you check the Drive link?
[screenshot]

Jake-Carter (Contributor) commented

Hi @smtalds, I can replicate the error. We use the hawk-eyes package to obtain an implementation of the RetinaFace model, and it looks like the Google Drive link for the model file has been removed or moved. It's an internal error coming from the hawk-eyes package, which doesn't seem to be actively maintained anymore.

I will look into it to see if there is any more info. We may need to consider alternative methods for running RetinaFace.

Jake-Carter (Contributor) commented

Opened analogdevicesinc/ai8x-training#315

smtalds commented Jul 26, 2024

@Jake-Carter Hi again, I want to ask a question about the frame resolution for the CNN. The image resolution is set to 150x200. I am using a MIPI camera, but I cannot achieve the correct alignment for a width of 150. If I change the resolution from 150x200 to 160x200, will that cause any problem for the CNN?

Jake-Carter (Contributor) commented

Hi @smtalds, the hardware can handle a 160x200x3 input layer. Just be sure you've enabled streaming mode.

You will also need to train the model itself for 160x200 input, but the hardware should not have any issues loading a frame at this resolution.
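If existing captures or training data are 150 pixels wide, one way to move to a 160-wide input is to zero-pad each row symmetrically on the host side before retraining. A standalone C sketch of that idea (hypothetical helper; an RGB888 row layout is assumed here, which may differ from the actual pipeline):

```c
#include <assert.h>
#include <string.h>

#define SRC_W    150 /* original image width */
#define DST_W    160 /* new, alignment-friendly width */
#define CHANNELS 3   /* assumed RGB888 layout */

/* Zero-pad one row from 150 to 160 pixels, 5 pixels on each side. */
static void pad_row(const unsigned char *src, unsigned char *dst)
{
    const int pad = (DST_W - SRC_W) / 2; /* 5 pixels */
    memset(dst, 0, (size_t)DST_W * CHANNELS);
    memcpy(dst + (size_t)pad * CHANNELS, src, (size_t)SRC_W * CHANNELS);
}
```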

smtalds commented Aug 22, 2024

Hi again @Jake-Carter, I replaced the DVP camera with a CSI camera. The CSI example uses the SRAM, so when I want to show the camera frame on the Adafruit display, the display keeps refreshing because of MXC_TFT_Init(). They use the same SPI channel. Does that mean I can't use the Adafruit display?

[screenshot]

Thanks for answering.

Jake-Carter (Contributor) commented

Hi @smtalds, the TFT shares the same SPI bus as the SRAM, but it requires different SPI settings and a software-controlled slave select.

So you can use them simultaneously, but you'll need to re-initialize the SPI hardware on the fly.

See camera_display_last_image in the pascalvoc_retinanet_v7_3 project for an example. This project uses both the SRAM and TFT simultaneously.
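The general pattern can be sketched in hardware-free C (hypothetical types and names; the real re-initialization uses the MSDK SPI driver as in the project above): keep one settings struct per device on the shared bus, and reapply it before each transaction.

```c
/* Hypothetical per-device SPI settings for a shared bus (not MSDK types). */
typedef struct {
    unsigned freq_hz; /* bus clock for this device */
    int mode;         /* SPI mode 0..3 */
    int sw_ss_pin;    /* software-controlled slave select, -1 if hardware SS */
} spi_cfg_t;

static spi_cfg_t active; /* what the (simulated) SPI peripheral is set to */

/* Re-initialize the shared bus for a device before talking to it.
 * In real firmware this would reconfigure the SPI hardware on the fly. */
static void spi_acquire(const spi_cfg_t *dev)
{
    active = *dev;
}
```

Before each SRAM or TFT transfer, the application calls `spi_acquire` with that device's settings, so both peripherals can share the one bus.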

Jake-Carter (Contributor) commented

@smtalds FYI I've also just merged some cleaner drivers for the APS6404 in #1095.

The project above still uses some local copies, but I will update them in the future.

smtalds commented Aug 27, 2024

Hi @Jake-Carter, I'll check it out. I want to ask something: why do we need to write the frame to the SRAM? Can't we write it to internal memory? I tried writing it to memory with DMA, so I don't need SPI for the SRAM. Is this the right way?

Jake-Carter (Contributor) commented

@smtalds for that specific example project, the CNN model requires some Non-Maximal Suppression (NMS) post-processing that takes up most of the available internal memory, so there isn't enough space to buffer the image and do the NMS.

If you have enough space, you can absolutely store the frames in internal memory. This is actually more ideal since it will be much faster. This is why I exposed the CSI line handler to the application code, so that you have the flexibility to store the data where you want/need it.
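The line-handler idea can be sketched as a hardware-free C simulation (hypothetical names; the real hook is the CSI line handler mentioned above), copying each received line straight into an internal buffer instead of shipping it out to the external SPI SRAM:

```c
#include <assert.h>
#include <string.h>

#define ROW_BYTES 640 /* e.g. 320 px x 2 bytes/px (RGB565) */
#define ROWS      240 /* QVGA height */

static unsigned char frame[ROWS * ROW_BYTES]; /* internal frame buffer */
static unsigned rows_done;

/* Called once per received line; accumulates the line into `frame`
 * and ignores any extra lines once the frame is complete. */
static void line_handler(const unsigned char *line)
{
    if (rows_done < ROWS) {
        memcpy(frame + (size_t)rows_done * ROW_BYTES, line, ROW_BYTES);
        rows_done++;
    }
}
```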

smtalds commented Sep 3, 2024

@Jake-Carter Thanks for answering. I was wondering about your CSI-2 IP; how can I find its spec? (mipi_camera.c)

Jake-Carter (Contributor) commented

@smtalds we have an internal document for it, but I'm not sure if we can share it. We may need to have an NDA in place.

Shoot me an email at jake.carter@analog.com if you'd like to start the process.

smtalds commented Sep 5, 2024

@Jake-Carter I sent the mail. Can you check? Thanks for answering.
