
Conversation

samh-efx

@enjoy-digital Hi Florent, could you help try running this port through synthesis?

It works fine with vexriscv and vexriscv-smp (with the --with-wishbone-memory flag), so the Wishbone-to-AXI path using the AXI point-to-point interconnect is fine.

With cores like Rocket or NaxRiscv, it fails a bit-slicing assertion during the FHDL generation phase;
when I comment those assertions out to try anyway, it fails during synthesis due to slicing out of range.
So I'm guessing I got an address width wrong for the Wishbone2AXI,
or the way I do bus region decoding for the AXICrossbar slaves is wrong?
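For reference, a minimal sketch of the address-width relationship in question (illustrative only, not the actual PR code; the Wishbone2AXI constructor arguments may differ between LiteX versions):

    # Sketch: LiteX Wishbone is word-addressed, so to cover a byte-addressed
    # AXI space of `address_width` bits, adr_width is usually reduced by
    # log2(data_width/8). Getting this wrong leads to out-of-range slices.
    from migen import Module, log2_int
    from litex.soc.interconnect import wishbone, axi

    class Wb2AxiSketch(Module):
        def __init__(self, data_width=512, address_width=32):
            adr_width = address_width - log2_int(data_width // 8)  # 32 - 6 = 26
            self.wb  = wishbone.Interface(data_width=data_width, adr_width=adr_width)
            self.axi = axi.AXIInterface(data_width=data_width, address_width=address_width, id_width=8)
            # Wishbone2AXI is the converter mentioned above; exact arguments may
            # differ between LiteX versions, so treat this as an approximation.
            self.submodules.conv = axi.Wishbone2AXI(self.wb, self.axi)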

@enjoy-digital
Member

Hi @samh-efx,

I indeed reproduced the issue. Instead of searching for the real cause, I made some improvements to LiteX to simplify creating additional interconnects: enjoy-digital/litex@13448b8

This now allows your custom interconnect to be described like this:

        xbar_slaves = self.add_sdram_io(platform)
        self.xbar_bus = SoCBusHandler(
            name             = "SoCXBARBusHandler",
            standard         = "axi",
            data_width       = 512,
            address_width    = 32,
            bursting         = True
        )
        for master in xbar_masters:
            self.xbar_bus.add_master(master=master)
        self.xbar_bus.add_slave("main_ram", slave=xbar_slaves["main_ram"], region=SoCRegion(origin=0x00000000, size=0x100000000)) # FIXME: covers lower 4GB only
        self.xbar_bus.finalize()

With this, ./efinix_titanium_ti180_m484_dev_kit.py --cpu-type=naxriscv --build is still building here and hasn't crashed yet :)
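A side note on the FIXME above: with address_width=32 a region can cover at most the lower 4 GiB. If more needs to be mapped, the handler could be widened along these lines (a sketch only, assuming the attached masters and slaves actually support the wider addressing):

        self.xbar_bus = SoCBusHandler(
            name             = "SoCXBARBusHandler",
            standard         = "axi",
            data_width       = 512,
            address_width    = 33,   # 8 GiB of byte-addressable space (assumption)
            bursting         = True
        )
        self.xbar_bus.add_slave("main_ram",
            slave  = xbar_slaves["main_ram"],
            region = SoCRegion(origin=0x0_0000_0000, size=0x2_0000_0000))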

@enjoy-digital
Member

enjoy-digital commented Nov 14, 2022

@samh-efx: The best approach for your design would probably be to set up a simulation environment with LiteX (similar to what we are doing with litex_sim, but with LiteDRAM generated as a standalone core, integrating the simulation model (--sim) and with an AXI user port). This would allow you to simulate the different CPUs with Verilator and verify that the generated logic is correct before testing on hardware. I have limited time to do this currently, but please contact me directly if you would be interested in speeding this up and getting more support on this.
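As a rough starting point (a sketch only; this uses plain litex_sim rather than the standalone-LiteDRAM flow described above, and the exact flags may vary with the LiteX version), something like:

    litex_sim --cpu-type=naxriscv --with-sdram --trace

already exercises the CPU against a simulated SDRAM under Verilator.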

Comment on lines +64 to +69
("user_led", 0, Pins("E1"), IOStandard("3.3_V_LVTTL_/_LVCMOS"), Misc("DRIVE_STRENGTH=3")), # led2 GPIOB_N_02
("user_led", 1, Pins("F1"), IOStandard("3.3_V_LVTTL_/_LVCMOS"), Misc("DRIVE_STRENGTH=3")), # led3 GPIOB_P_02
("user_led", 2, Pins("C2"), IOStandard("3.3_V_LVTTL_/_LVCMOS"), Misc("DRIVE_STRENGTH=3")), # led4 GPIOB_P_13
("user_led", 3, Pins("E3"), IOStandard("3.3_V_LVTTL_/_LVCMOS"), Misc("DRIVE_STRENGTH=3")), # led5 GPIOB_P_14
("user_led", 4, Pins("B1"), IOStandard("3.3_V_LVTTL_/_LVCMOS"), Misc("DRIVE_STRENGTH=3")), # led6 GPIOB_N_11
("user_led", 5, Pins("B2"), IOStandard("3.3_V_LVTTL_/_LVCMOS"), Misc("DRIVE_STRENGTH=3")), # led7 GPIOB_P_12

On Titanium, this needs to be spelled 3.3_V_LVCMOS and DRIVE_STRENGTH needs to be a multiple of two, right?
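If so, the entries would presumably look something like this (a sketch under that assumption, not verified against the Titanium datasheet):

("user_led", 0, Pins("E1"), IOStandard("3.3_V_LVCMOS"), Misc("DRIVE_STRENGTH=4")), # led2 GPIOB_N_02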


Also, for that matter, the bank these are on is 1.8V on Ti180J484 EVK; is that the same, or different, on M484 EVK? It looks like most of the other pins are the same.


if args.flash:
    from litex.build.openfpgaloader import OpenFPGALoader
    prog = OpenFPGALoader("titanium_ti180_m484")

This board isn't upstream yet, is it? If this works for you, any chance I could get early access to it?

@liqihong8

Hi @samh-efx,
I found the following error while executing this build command:
litex-boards/litex_boards/targets/efinix_titanium_ti180_m484_dev_kit.py --build

INFO: Save design...done.
[WARNING ] Setting programming model to default setting of mode: active, width: 1
ERROR: Interface Designer project file design check has found errors.
Open the Interface Designer tool for details.
Skipping Interface Designer
Traceback (most recent call last):
  File "litex-boards/litex_boards/targets/efinix_titanium_ti180_m484_dev_kit.py", line 768, in <module>
    main()
  File "litex-boards/litex_boards/targets/efinix_titanium_ti180_m484_dev_kit.py", line 511, in main
    builder.build()
  File "/root/my-venv/work/python-litex-20250311/litex/litex/soc/integration/builder.py", line 415, in build
    vns = self.soc.build(build_dir=self.gateware_dir, **kwargs)
  File "/root/my-venv/work/python-litex-20250311/litex/litex/soc/integration/soc.py", line 1498, in build
    return self.platform.build(self, *args, **kwargs)
  File "/root/my-venv/work/python-litex-20250311/litex/litex/build/efinix/platform.py", line 70, in build
    return self.toolchain.build(self, *args, **kwargs)
  File "/root/my-venv/work/python-litex-20250311/litex/litex/build/efinix/efinity.py", line 76, in build
    return GenericToolchain.build(self, platform, fragment, **kwargs)
  File "/root/my-venv/work/python-litex-20250311/litex/litex/build/generic_toolchain.py", line 123, in build
    self.run_script(script)
  File "/root/my-venv/work/python-litex-20250311/litex/litex/build/efinix/efinity.py", line 352, in run_script
    raise OSError("Error occurred during efx_run_pt execution.")
OSError: Error occurred during efx_run_pt execution.

When I execute the following build command instead, no errors are reported, which indicates that my build environment is correct:
litex-boards/litex_boards/targets/efinix_ti375_c529_dev_kit.py --build
