Conversation

@devmam999
Contributor

No description provided.

Member

@fdxmw fdxmw left a comment


It would be good to test this new feature, maybe we can modify one of the existing tests to exercise module_name?

@gaborszita
Contributor

Great PR! I also wanted this feature.

Fix formatting of module declaration and add module_name parameter.
@devmam999
Contributor Author

Hey, I have updated my branch. Please check whether any issues still persist.

@fdxmw
Member

fdxmw commented Oct 25, 2025

It would be good to test this new feature, maybe we can modify one of the existing tests to exercise module_name?

Please ensure that there are tests for your new code.

@devmam999
Contributor Author

Hey, I updated my code again and this time I made sure it passed all the tests. I hope it works now!

@devmam999 devmam999 force-pushed the feature/module_name_param branch from dff6a32 to 593d1e6 Compare October 28, 2025 00:50
@devmam999 devmam999 force-pushed the feature/module_name_param branch from 593d1e6 to 9ee7b78 Compare October 28, 2025 00:52
@gaborszita
Contributor

@devmam999 btw there are a few Verilog files that you seem to have used for testing in the parent directory on your branch - did you accidentally commit these?

@codecov

codecov bot commented Oct 30, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 91.5%. Comparing base (c9a7b10) to head (c252752).
⚠️ Report is 1 commit behind head on development.

Additional details and impacted files
@@              Coverage Diff              @@
##           development    #474     +/-   ##
=============================================
+ Coverage         91.5%   91.5%   +0.1%     
=============================================
  Files               25      25             
  Lines             7089    7090      +1     
=============================================
+ Hits              6486    6487      +1     
  Misses             603     603             


@fdxmw
Member

fdxmw commented Oct 30, 2025

@devmam999 btw there are a few Verilog files that you seem to have used for testing in the parent directory on your branch - did you accidentally commit these?

I agree, Verilog files do not belong in this pull request.

Please also see my earlier comments about adding documentation and testing for your new feature to this pull request.

@devmam999
Contributor Author

To be clear, with this latest update on my branch, module_name is part of the Verilog output class, so it should be able to support custom names.

@fdxmw
Member

fdxmw commented Nov 4, 2025

To be clear, with this latest update on my branch, module_name is part of the Verilog output class, so it should be able to support custom names.

Your current changes to _VerilogOutput look good to me, but these changes won't be available to users calling pyrtl.output_to_verilog or pyrtl.output_verilog_testbench because these functions still instantiate _VerilogOutput without passing module_name to its constructor.

So this is a great start, but before we can merge this PR we'll need to wrap up a few details:

  1. Update pyrtl.output_to_verilog and pyrtl.output_verilog_testbench to add a module_name parameter. This will just pass through to _VerilogOutput's new module_name parameter.
  2. Update the documentation for pyrtl.output_to_verilog and pyrtl.output_verilog_testbench to document the new module_name parameter. Your updated documentation should be visible on your PR's generated documentation.
  3. Add new tests, or update existing tests, to verify that pyrtl.output_to_verilog and pyrtl.output_verilog_testbench work correctly when the new module_name parameter is set. Your new/updated tests should fail with the PR in its current state, since it currently does not modify these functions.
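Step 1 in the list above could be sketched roughly as follows. The `_VerilogOutput` class here is a minimal stand-in so the example runs on its own; PyRTL's actual constructor and method signatures in importexport.py may differ.

```python
import io

class _VerilogOutput:
    """Stand-in for PyRTL's internal Verilog writer (assumed shape)."""

    def __init__(self, block=None, module_name="toplevel"):
        self.block = block
        self.module_name = module_name

    def output_to_verilog(self, dest):
        # The real method walks the hardware block; this sketch only
        # emits the module header to show where module_name lands.
        dest.write(f"module {self.module_name}(input clk);\n")
        dest.write("endmodule\n")

def output_to_verilog(dest, block=None, module_name="toplevel"):
    """Top-level wrapper: forwards module_name through to the helper."""
    _VerilogOutput(block, module_name=module_name).output_to_verilog(dest)

buf = io.StringIO()
output_to_verilog(buf, module_name="my_adder")
print(buf.getvalue().splitlines()[0])  # prints: module my_adder(input clk);
```

The point is only the pass-through: the wrapper accepts `module_name` and hands it unchanged to the helper's constructor, so users never touch the internal class.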

@devmam999
Contributor Author

"Add new tests, or update existing tests, to verify that pyrtl.output_to_verilog and pyrtl.output_verilog_testbench work correctly when the new module_name parameter is set. Your new/updated tests should fail with the PR in its current state, since it currently does not modify these functions."

So I'm confused, you want some of my tests to fail?

@devmam999
Contributor Author

Also I am confused as to why you want two of my functions to have module_name as a parameter after I said I wanted to use the class's parameter

@fdxmw
Member

fdxmw commented Nov 4, 2025

Also I am confused as to why you want two of my functions to have module_name as a parameter after I said I wanted to use the class's parameter

Sorry for the confusion! I'll try to clear things up. First, let's make sure we agree that there are two different things here that are both called output_to_verilog.

  1. pyrtl.importexport.output_to_verilog. This is a top-level function in the importexport module. This function is given the alias pyrtl.output_to_verilog by pyrtl/__init__.py. Users directly call pyrtl.output_to_verilog to export their PyRTL code to Verilog. This function is currently not modified by your PR. It will need to be modified by your PR for users to benefit from your changes.
  2. _VerilogOutput.output_to_verilog. This is a helper method in the _VerilogOutput class that's called by pyrtl.output_to_verilog. Users do not call this method. Your PR currently modifies this helper method's behavior.

There is a similar distinction between pyrtl.output_verilog_testbench and _VerilogOutput.output_verilog_testbench.

Your PR currently adds your new module_name feature to the _VerilogOutput.output_to_verilog and _VerilogOutput.output_verilog_testbench helper methods, but does not actually update pyrtl.output_to_verilog or pyrtl.output_verilog_testbench to allow users to use your new feature.

I am requesting that you update pyrtl.output_to_verilog and pyrtl.output_verilog_testbench in your PR.

"Add new tests, or update existing tests, to verify that pyrtl.output_to_verilog and pyrtl.output_verilog_testbench work correctly when the new module_name parameter is set. Your new/updated tests should fail with the PR in its current state, since it currently does not modify these functions."

So I'm confused, you want some of my tests to fail?

I am requesting tests in this PR that verify that pyrtl.output_to_verilog and pyrtl.output_verilog_testbench work correctly with your new module_name feature. Your new/updated tests must pass before we can merge your PR.

With your PR in its current state, your new/updated test should not pass because pyrtl.output_to_verilog and pyrtl.output_verilog_testbench have not yet been updated to use your new feature.

I recommend verifying this for yourself by writing the test first, verifying that it fails, then updating pyrtl.output_to_verilog and pyrtl.output_verilog_testbench to use your new feature, then verifying that the test passes.
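That fail-first loop might look something like this sketch. `fake_output_to_verilog` is a stand-in so the example runs on its own, and the header format it emits is an assumption, not PyRTL's actual output:

```python
import io

def fake_output_to_verilog(dest, module_name="toplevel"):
    # Stand-in for pyrtl.output_to_verilog; the real function emits
    # the whole netlist, but only the header matters for this test.
    dest.write(f"module {module_name}(input clk);\nendmodule\n")

def test_custom_module_name():
    buf = io.StringIO()
    fake_output_to_verilog(buf, module_name="my_counter")
    header = buf.getvalue().splitlines()[0]
    # Before the wrapper forwards module_name, this assertion fails
    # because the header still says "toplevel"; afterwards it passes.
    assert "my_counter" in header

test_custom_module_name()
print("test passed")  # prints: test passed
```

Writing the assertion first, watching it fail, then making it pass is exactly the loop described above.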

But that's just my recommendation, you can do whatever you want, as long as you update pyrtl.output_to_verilog and pyrtl.output_verilog_testbench, and test that they work with module_name.

Hope this clears things up, let me know if you have more questions!

@devmam999
Contributor Author

Alright, thanks a lot. I have updated my file again; I hope it's good now.

@fdxmw
Member

fdxmw commented Nov 7, 2025

Alright, thanks a lot. I have updated my file again; I hope it's good now.

Your changes to importexport.py look good to me, but we still need tests that verify that pyrtl.output_to_verilog and pyrtl.output_verilog_testbench work correctly with your new module_name feature. Did you forget to add them to your PR?

@devmam999
Contributor Author

Remind me how to add the tests? I remember last time I added test files you told me to remove them.

@fdxmw
Member

fdxmw commented Nov 9, 2025

Remind me how to add the tests? I remember last time I added test files you told me to remove them.

In software projects, "tests" generally refer to code that checks whether another piece of code works properly. In Python, this is usually done with the unittest module. Please read the documentation for unittest, which explains the idea in more detail and provides some simple examples.

Tests are crucial for projects with many contributors because:

  1. Tests are easy to run. You don't have to be a Simulation expert to run the Simulation tests, anyone can just run test_simulation.py to see if Simulation still works properly.
  2. All developers working on the project are expected to run the tests, so they give developers a strong and immediate signal when they break something.
  3. Tests are your only reasonable defense against someone else accidentally breaking your code in the future. You will write a lot of code over your lifetime, and you can't keep an eye on every line to make sure nobody else messes it up. We all rely on tests to do this work for us.
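The unittest idea from the list above can be shown with a toy example (deliberately unrelated to PyRTL):

```python
import unittest

def add(a, b):
    """Toy function under test."""
    return a + b

class TestAdd(unittest.TestCase):
    def test_small_numbers(self):
        # Fails loudly if add() ever stops returning 5 here.
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-1, 1), 0)

# Run the suite programmatically; `python -m unittest` also works.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all tests passed:", result.wasSuccessful())  # prints: all tests passed: True
```

A real test for this feature would instead call pyrtl.output_to_verilog and assert on the text it generates.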

A generated Verilog file is not really a test, because the reader has to know something about Verilog to tell if the generated Verilog file is correct or not. Also, a generated Verilog file will not re-generate itself when someone changes output_to_verilog in the future.

PyRTL has many tests in the tests directory. There are existing tests for export_to_verilog and export_verilog_testbench in test_importexport.py.

You can decide whether your feature deserves its own new tests in test_importexport.py, or if existing tests in test_importexport.py can be extended to test your new feature.

Hope this makes sense, let me know if you have more questions!

@devmam999
Contributor Author

Interesting, thanks for the clarifying comment

@devmam999
Contributor Author

Alright, I have added my test

Member

@fdxmw fdxmw left a comment


Your new test looks good, here are some minor comments. There are also some lint errors identified by the test automation.

Next time, please run uv run just presubmit to check for lint errors before requesting another review. See this announcement for background on the new uv-based developer workflow.

After these comments and lint errors are addressed, I think we can merge this pull request. Thank you for your patience!

@fdxmw
Member

fdxmw commented Nov 14, 2025

Thank you for addressing my comments! We are almost there, you'll need to fix the formatting and lint errors identified by the pull request checks before we can merge. See my earlier comment.

@devmam999
Contributor Author

I've tried doing that but I get this error

"Error: No justfile found"

@fdxmw
Member

fdxmw commented Nov 15, 2025

I've tried doing that but I get this error

"Error: No justfile found"

It sounds like your fork is not up to date. Try syncing your fork, and make sure it includes commit 1c06c9e.

@devmam999
Contributor Author

Alright, thanks. The formatting errors should hopefully be fixed now.

@fdxmw fdxmw merged commit 2a1aa8a into UCSBarchlab:development Nov 17, 2025
4 checks passed
@fdxmw fdxmw linked an issue Nov 17, 2025 that may be closed by this pull request


Development

Successfully merging this pull request may close these issues.

"toplevel" in verilog output is hardcoded

4 participants