
Refactor sendAndReceive #64


Draft · wants to merge 10 commits into main

Conversation

@ryan-gang ryan-gang (Contributor) commented Jul 22, 2025

  • Added hex dump formatter and API key name resolver functions
  • Implemented concurrency-safe Kafka client with connect, send, and receive methods (see the sketch after this list)
  • Unified RequestHeader encoding and simplified request encoding
  • Updated DescribeTopicPartitionsRequest encoding method for improved clarity
  • Replaced Broker with Client for Kafka connection and request handling
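
For orientation, a minimal sketch of the client shape these bullets describe, assuming the names mentioned in this PR (Client, NewClient, ConnectWithRetries); the retry policy, fields, and method bodies below are illustrative assumptions, not the actual implementation:

package kafka_client

import (
	"fmt"
	"net"
	"sync"
	"time"
)

// Client is an illustrative stand-in for the type in protocol/kafka_client/client.go.
type Client struct {
	addr string
	conn net.Conn
	mu   sync.Mutex // guards conn so concurrent send/receive pairs don't interleave
}

func NewClient(addr string) *Client {
	return &Client{addr: addr}
}

// ConnectWithRetries dials the broker, retrying a few times before giving up.
// The retry count and delays here are assumptions.
func (c *Client) ConnectWithRetries() error {
	var err error
	for attempt := 0; attempt < 5; attempt++ {
		c.conn, err = net.DialTimeout("tcp", c.addr, 1*time.Second)
		if err == nil {
			return nil
		}
		time.Sleep(200 * time.Millisecond)
	}
	return fmt.Errorf("could not connect to %s: %w", c.addr, err)
}

// Close releases the underlying connection.
func (c *Client) Close() error {
	if c.conn != nil {
		return c.conn.Close()
	}
	return nil
}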

Summary by CodeRabbit

  • New Features

    • Introduced a new Kafka client for connecting, sending, and receiving protocol requests with enhanced error handling and logging.
    • Added utility functions for generating formatted hex dumps and mapping API keys to their names.
  • Refactor

    • Updated request encoding and sending logic to use the new Kafka client, simplifying request handling and resource management.
    • Streamlined encoding methods for protocol requests and headers for improved consistency.
  • Documentation

    • Added comments indicating planned deprecation of certain utility functions.


coderabbitai bot commented Jul 22, 2025

Warning

Rate limit exceeded

@ryan-gang has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 11 minutes and 34 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.


📥 Commits

Reviewing files that changed from the base of the PR and between 2bc92ec and 834228f.

📒 Files selected for processing (1)
  • protocol/kafka_client/client.go (1 hunks)

"""

Walkthrough

The changes introduce a new Kafka client implementation, update request encoding logic, and refactor related APIs. The new client manages connections, sending, and receiving Kafka protocol messages, with enhanced error handling and logging. Request encoding is simplified and made self-contained. Utility functions for hexdump formatting and API key naming are added, and interface definitions are updated to support the new client.

Changes

  • internal/stage_dtp2.go: Switched from protocol.Broker to kafka_client.Client for Kafka broker interaction; updated connection, send/receive, and cleanup logic. Removed raw hexdump logging.
  • internal/utils.go: Added a comment indicating future deprecation of GetFormattedHexdump.
  • protocol/api/describe_topic_partitions.go: Simplified EncodeDescribeTopicPartitionsRequest to directly call the request's Encode() method.
  • protocol/api/describe_topic_partitions_request.go: Refactored DescribeTopicPartitionsRequest; removed the old Encode and added self-contained Encode and GetHeader methods.
  • protocol/api/header.go: Changed EncodeV2 to a value receiver and unexported it; added an exported Encode method that wraps it.
  • protocol/builder/interface.go: Introduced the RequestI interface with Encode() and GetHeader() methods (see the sketch after this list).
  • protocol/kafka_client/client.go: Added the new Kafka client implementation: connection management, send/receive, concurrency, error handling, logging, and a response struct.
  • protocol/utils/utils.go: Added utility functions GetFormattedHexdump (hex dumps) and APIKeyToName (API key string mapping).
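
To make the interface-related entries above concrete, here is a rough, self-contained sketch of the pattern (referenced from the RequestI entry); the real code lives in protocol/builder/interface.go and protocol/api/, and the header fields, wire layout, and topic encoding below are simplified assumptions:

package builder

import "encoding/binary"

// RequestHeader is a simplified stand-in for kafkaapi.RequestHeader in
// protocol/api/header.go; the real struct has more fields.
type RequestHeader struct {
	ApiKey        int16
	ApiVersion    int16
	CorrelationId int32
}

// encodeV2 is unexported, mirroring the change described above; the exported
// Encode method simply wraps it. The byte layout here is illustrative.
func (h RequestHeader) encodeV2() []byte {
	buf := make([]byte, 8)
	binary.BigEndian.PutUint16(buf[0:2], uint16(h.ApiKey))
	binary.BigEndian.PutUint16(buf[2:4], uint16(h.ApiVersion))
	binary.BigEndian.PutUint32(buf[4:8], uint32(h.CorrelationId))
	return buf
}

func (h RequestHeader) Encode() []byte { return h.encodeV2() }

// RequestI mirrors the interface added in protocol/builder/interface.go,
// using the local RequestHeader type so this sketch compiles on its own.
type RequestI interface {
	Encode() []byte
	GetHeader() RequestHeader
}

// DescribeTopicPartitionsRequest illustrates the "self-contained Encode and
// GetHeader" shape; the topic-name encoding is not the real wire format.
type DescribeTopicPartitionsRequest struct {
	Header RequestHeader
	Topics []string
}

func (r *DescribeTopicPartitionsRequest) GetHeader() RequestHeader { return r.Header }

func (r *DescribeTopicPartitionsRequest) Encode() []byte {
	out := r.Header.Encode()
	for _, t := range r.Topics {
		out = append(out, byte(len(t)))
		out = append(out, t...)
	}
	return out
}

// EncodeDescribeTopicPartitionsRequest shows the simplification noted in the
// describe_topic_partitions.go entry: it just delegates to the request.
func EncodeDescribeTopicPartitionsRequest(request *DescribeTopicPartitionsRequest) []byte {
	return request.Encode()
}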

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant Stage as stage_dtp2.go
    participant Client as kafka_client.Client
    participant Broker as Kafka Broker

    User->>Stage: Initiate request
    Stage->>Client: NewClient("localhost:9092")
    Stage->>Client: ConnectWithRetries()
    Client->>Broker: Establish TCP connection
    Stage->>Client: SendAndReceive(request, logger)
    Client->>Broker: Send encoded request bytes
    Broker-->>Client: Return response bytes
    Client-->>Stage: Return Response struct
    Stage->>Client: Close()
    Client-->>Stage: Connection closed
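
The same flow, written as a hypothetical caller; the function and argument types are assumptions based on the diagram and the file list above, and the actual signatures in stage_dtp2.go and client.go may differ:

package internal

import (
	"github.com/codecrafters-io/kafka-tester/protocol/builder"
	"github.com/codecrafters-io/kafka-tester/protocol/kafka_client"
	"github.com/codecrafters-io/tester-utils/logger"
)

// runRequest is a hypothetical helper mirroring the sequence diagram above.
func runRequest(request builder.RequestI, log *logger.Logger) error {
	client := kafka_client.NewClient("localhost:9092")
	if err := client.ConnectWithRetries(); err != nil {
		return err
	}
	defer client.Close()

	response, err := client.SendAndReceive(request, log)
	if err != nil {
		return err
	}
	_ = response // decode / assert on the Response struct here
	return nil
}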

Estimated code review effort

3 (~45 minutes)

Poem

In burrows deep, the code did grow,
A Kafka client, swift to show—
With hex and keys and headers neat,
Requests now travel, self-contained and fleet.
The broker hops in bytes anew,
While rabbits cheer this coding coup!
🐇✨
"""



@ryan-gang ryan-gang self-assigned this Jul 22, 2025
@@ -3,19 +3,12 @@ package kafkaapi
import (
"github.com/codecrafters-io/kafka-tester/protocol"
"github.com/codecrafters-io/kafka-tester/protocol/decoder"
"github.com/codecrafters-io/kafka-tester/protocol/encoder"
"github.com/codecrafters-io/kafka-tester/protocol/errors"
"github.com/codecrafters-io/tester-utils/logger"
)

func EncodeDescribeTopicPartitionsRequest(request *DescribeTopicPartitionsRequest) []byte {
ryan-gang (Contributor, Author) commented:

Will be removed.

@ryan-gang ryan-gang requested a review from rohitpaulk July 22, 2025 05:38

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 6

🧹 Nitpick comments (3)
protocol/utils/utils.go (1)

50-65: Consider more graceful error handling for unknown API keys.

While the function correctly maps known API keys to names, using panic for unknown keys might be too harsh. Consider returning an error or a default string like "Unknown API Key (X)" to make the code more robust.

 func APIKeyToName(apiKey int16) string {
 	switch apiKey {
 	case 0:
 		return "Produce"
 	case 1:
 		return "Fetch"
 	case 18:
 		return "ApiVersions"
 	case 19:
 		return "CreateTopics"
 	case 75:
 		return "DescribeTopicPartitions"
 	default:
-		panic(fmt.Sprintf("CodeCrafters Internal Error: Unknown API key: %v", apiKey))
+		return fmt.Sprintf("Unknown API Key (%d)", apiKey)
 	}
 }

However, if this is specifically for a testing framework where unknown API keys indicate programming errors, the current panic approach may be acceptable.

protocol/builder/interface.go (1)

5-8: Consider Go naming convention for interfaces.

The interface design is solid and provides a clean abstraction for request handling. However, Go convention typically avoids the 'I' suffix for interfaces. Consider renaming to just Request or RequestEncoder.

-type RequestI interface {
+type Request interface {
 	Encode() []byte
 	GetHeader() kafkaapi.RequestHeader
 }
protocol/kafka_client/client.go (1)

197-221: Improve error handling structure.

The empty else block is unnecessary. Consider restructuring the error handling for better readability.

 	if err != nil && err != io.EOF {
 		if netErr, ok := err.(net.Error); ok && netErr.Timeout() {
-		} else {
-			return nil, fmt.Errorf("error reading from connection: %v", err)
-		}
+			// Timeout is expected, return what we have
+			return buf.Bytes(), nil
+		}
+		return nil, fmt.Errorf("error reading from connection: %v", err)
 	}
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bbb36bd and 0e2f0c8.

📒 Files selected for processing (8)
  • internal/stage_dtp2.go (3 hunks)
  • internal/utils.go (1 hunks)
  • protocol/api/describe_topic_partitions.go (1 hunks)
  • protocol/api/describe_topic_partitions_request.go (1 hunks)
  • protocol/api/header.go (1 hunks)
  • protocol/builder/interface.go (1 hunks)
  • protocol/kafka_client/client.go (1 hunks)
  • protocol/utils/utils.go (1 hunks)
🧬 Code Graph Analysis (5)
protocol/builder/interface.go (1)
protocol/api/header.go (1)
  • RequestHeader (12-21)
protocol/utils/utils.go (1)
internal/utils.go (1)
  • GetFormattedHexdump (25-65)
protocol/api/describe_topic_partitions_request.go (2)
protocol/api/header.go (1)
  • RequestHeader (12-21)
protocol/encoder/encoder.go (1)
  • Encoder (13-16)
internal/stage_dtp2.go (1)
protocol/kafka_client/client.go (2)
  • NewClient (39-41)
  • Client (31-35)
protocol/kafka_client/client.go (3)
internal/kafka_executable/kafka_executable.go (1)
  • KafkaExecutable (14-18)
protocol/builder/interface.go (1)
  • RequestI (5-8)
protocol/utils/utils.go (1)
  • APIKeyToName (50-65)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Cursor BugBot
🔇 Additional comments (7)
protocol/utils/utils.go (1)

8-48: LGTM! Well-implemented hex dump formatter.

The function correctly formats byte data into a readable hex dump with proper alignment, ASCII representation, and padding. The use of strings.Builder for efficient string concatenation is a good practice.
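
For readers unfamiliar with the shape of such a formatter, a minimal self-contained sketch; the actual GetFormattedHexdump may use a different width, offset format, and spacing:

package utils

import (
	"fmt"
	"strings"
)

// formattedHexdump is an illustrative 16-bytes-per-row dump with an ASCII
// column; it is not the exact output format of GetFormattedHexdump.
func formattedHexdump(data []byte) string {
	var b strings.Builder
	for offset := 0; offset < len(data); offset += 16 {
		end := offset + 16
		if end > len(data) {
			end = len(data)
		}
		row := data[offset:end]

		// Offset column.
		fmt.Fprintf(&b, "%04x  ", offset)

		// Hex column, padded so the ASCII column always lines up.
		for i := 0; i < 16; i++ {
			if i < len(row) {
				fmt.Fprintf(&b, "%02x ", row[i])
			} else {
				b.WriteString("   ")
			}
		}

		// ASCII column: printable bytes as-is, everything else as '.'.
		b.WriteString(" |")
		for _, c := range row {
			if c >= 0x20 && c <= 0x7e {
				b.WriteByte(c)
			} else {
				b.WriteByte('.')
			}
		}
		b.WriteString("|\n")
	}
	return b.String()
}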

internal/utils.go (1)

23-24: Good practice to document the deprecation and migration path.

The TODO comment clearly indicates the planned removal and replacement, which will help with the refactoring process.

protocol/api/describe_topic_partitions.go (1)

10-12: Good simplification that aligns with the interface-driven refactoring.

The change to delegate encoding to the request's own Encode() method makes the code cleaner and more self-contained, supporting the new RequestI interface pattern.

internal/stage_dtp2.go (3)

9-9: Import statement correctly added for the new Kafka client.

The import aligns with the PR's objective to replace the Broker component with the new Client implementation.


28-34: Connection logic successfully migrated to the new Kafka client.

The changes improve connection handling with built-in retry logic and proper resource cleanup. The integration with the Kafka executable for monitoring is a good enhancement.


48-48: Request sending properly updated to use the new client interface.

The change from manual encoding to structured request handling improves code maintainability and aligns with the new architecture.

protocol/kafka_client/client.go (1)

111-136: Well-implemented request/response orchestration.

The method provides excellent logging and clean separation of concerns between sending and receiving operations.
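
A rough sketch of that send/receive split, using local stand-ins rather than the repository's types; the Response fields, the logging calls, and the size-prefixed read below are assumptions about the general pattern, not the code under review:

package kafka_client

import (
	"encoding/binary"
	"fmt"
	"io"
	"log"
	"net"
)

// Response is an illustrative stand-in for the struct returned by the client.
type Response struct {
	RawBytes []byte
}

// request is a minimal local stand-in for the builder.RequestI interface.
type request interface {
	Encode() []byte
	Name() string
}

// sendAndReceive sketches the orchestration described above: log, send, then
// receive, with each half kept in its own helper.
func sendAndReceive(conn net.Conn, req request) (Response, error) {
	log.Printf("sending %s request", req.Name())
	if err := send(conn, req.Encode()); err != nil {
		return Response{}, fmt.Errorf("error sending request: %w", err)
	}
	raw, err := receive(conn)
	if err != nil {
		return Response{}, fmt.Errorf("error receiving response: %w", err)
	}
	log.Printf("received %d bytes", len(raw))
	return Response{RawBytes: raw}, nil
}

func send(conn net.Conn, payload []byte) error {
	_, err := conn.Write(payload)
	return err
}

// receive assumes the Kafka wire format's 4-byte big-endian size prefix and
// reads exactly that many payload bytes.
func receive(conn net.Conn) ([]byte, error) {
	sizeBuf := make([]byte, 4)
	if _, err := io.ReadFull(conn, sizeBuf); err != nil {
		return nil, err
	}
	payload := make([]byte, binary.BigEndian.Uint32(sizeBuf))
	if _, err := io.ReadFull(conn, payload); err != nil {
		return nil, err
	}
	return payload, nil
}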


@rohitpaulk rohitpaulk (Member) left a comment

There are a bunch of things mixed in here: renaming Broker -> Client, changing the interface of SendAndReceive, moving GetFormattedHexdump to utils, and a bunch of interface work.

Split these out, and try to avoid in-progress comments for things that can clearly be done in one PR, like migrating the usages of GetFormattedHexdump.

@ryan-gang ryan-gang marked this pull request as draft July 24, 2025 07:52