Path to .NET Core Microservices: Modifying the Last Demo into Complete RPC Communication

Keywords: encoding, JSON, ZooKeeper

Updates have been a little delayed these days; sorry, everyone.
In the previous article we briefly introduced the DotNetty communication framework and an Echo (loopback) communication implementation built on it.
Let's recall the entire process of the previous project:
  1. When the server starts, it binds to and listens on a configured port, such as 1889.
  2. When the client starts, it connects to that port and waits for user input.
  3. When the user enters any string, the client encodes it into bytes and sends it to the server.
  4. When the server receives the data, it decodes it, prints it to the console, and sends the data back to the client.
  5. The client receives the data and prints it.
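
Steps 3 to 5 lived in the server's ChannelRead handler. As a quick recap, the echo behaviour looked roughly like the sketch below; this is reconstructed from memory under the same usings as the demo (DotNetty.Buffers, DotNetty.Transport.Channels, System.Text), not necessarily the exact code of the previous article.

// Rough sketch of the previous Echo demo's server-side read handler (recap only).
public override void ChannelRead(IChannelHandlerContext context, object message)
{
    if (message is IByteBuffer buffer)
    {
        // Decode the incoming bytes, print them, then write the same buffer back to the client.
        Console.WriteLine($"Received: {buffer.ToString(Encoding.UTF8)}");
        context.WriteAsync(message);
    }
}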

 

That gave us a simple point-to-point communication example. Next we will make a small modification to this demo to simulate building the simplest possible RPC communication.
 
This article is very simple: if you implemented the previous demo, a few modifications are enough to get a basic RPC call working (of course, building a real RPC framework such as gRPC is nowhere near this easy). The article also comes with a lightweight RPC framework I put together over the past few days, with an example at the end.
 

Server

Add two static methods, SayHello and SayByebye, to serve as the remotely callable methods. They are trivial and need no explanation.

public static class Say
{
    public static string SayHello(string content)
    {
        return $"hello {content}";
    }

    public static string SayByebye(string content)
    {
        return $"byebye {content}";
    }
}

 

 

In the original ChannelRead method, replace the Echo loop-back logic with the following.

public override void ChannelRead(IChannelHandlerContext context, object message)
{
    if (message is IByteBuffer buffer)
    {
        Console.WriteLine($"message length is {buffer.Capacity}");
        var obj = JsonConvert.DeserializeObject<Dictionary<string, string>>(buffer.ToString(Encoding.UTF8).Replace(")", "")); // (1)

        byte[] msg = null;
        if (obj["func"].Contains("sayHello"))  // (2)
        {
            msg = Encoding.UTF8.GetBytes(Say.SayHello(obj["username"]));
        }

        if (obj["func"].Contains("sayByebye")) // (2)
        {
            msg = Encoding.UTF8.GetBytes(Say.SayByebye(obj["username"]));
        }

        if (msg == null) return;
        // Set the buffer size for this transmission
        var b = Unpooled.Buffer(msg.Length, msg.Length); // (3)
        IByteBuffer byteBuffer = b.WriteBytes(msg); // (4)
        context.WriteAsync(byteBuffer); // (5)
    }
}

 

 

(1): Note the Replace(")", ""). I am not entirely sure why a stray parenthesis always shows up in the string decoded from the buffer; it is probably a message header, perhaps a marker added by protobuf-net, since the data arrives as bytes and the server, being lazy here, does not run a protobuf deserialization (see the sketch after these notes for the symmetric alternative).
Why use a Dictionary as the intermediate object? Because deserialization needs a concrete type, and for a simple introduction to RPC a dictionary is enough, as the code above shows.
(2): Decide which method to call by checking the "func" field, and convert the return value of that call to bytes.
(3): Allocate the buffer for this transmission with the right size.
(4): Write the message bytes into the DotNetty buffer.
(5): Finally, write the buffer to the current context (which wraps the channel, the transport objects, the connection, and so on).
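
On note (1): the stray character is consistent with what the client does below — it wraps the JSON string with protobuf-net, which prefixes a small field header (a tag byte plus a length byte; for this payload the length byte happens to be the ASCII code of ')'). The symmetric alternative on the server is to read the string back through protobuf-net, as in the sketch below. This is an alternative to Replace(")", ""), not the demo's code, and it assumes the same usings as the client code (System.IO, ProtoBuf).

// Alternative to the Replace(")", "") workaround (sketch only):
// let protobuf-net consume its own header instead of stripping characters by hand.
private static string ReadWrappedJson(IByteBuffer buffer)
{
    var bytes = new byte[buffer.ReadableBytes];
    buffer.ReadBytes(bytes);
    using (var stream = new MemoryStream(bytes))
    {
        return Serializer.Deserialize<string>(stream); // undoes Serializer.Serialize(stream, msg) on the client
    }
}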
 

Client

We make the following modifications to the EchoClientHandler from the previous demo to issue a simple request.

public EchoClientHandler()
{
    var hello = new Dictionary<string, string> // (1)
    {
        {"func", "sayHello"},
        {"username", "stevelee"}
    };
    SendMessage(ToStream(JsonConvert.SerializeObject(hello)));
}

private byte[] ToStream(string msg)
{
    Console.WriteLine($"string length is {msg.Length}");
    using (var stream = new MemoryStream()) // (2)
    {
        Serializer.Serialize(stream, msg);
        return stream.ToArray();
    }
}

private void SendMessage(byte[] msg)
{
    Console.WriteLine($"byte length is {msg.Length}");
    _initialMessage = Unpooled.Buffer(msg.Length, msg.Length);
    _initialMessage.WriteBytes(msg); // (3)
}

 

(1): Build the request data agreed with the server.
(2): Serialize the data into a binary stream.
(3): Write the data into the ByteBuffer.
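
For completeness: the buffer built in SendMessage is only prepared in the constructor; in the previous demo it is actually sent once the channel becomes active. A minimal sketch of that part, assuming the handler extends ChannelHandlerAdapter and the _initialMessage field shown above:

// Sketch: flush the prepared request as soon as the connection is established.
public override void ChannelActive(IChannelHandlerContext context)
{
    context.WriteAndFlushAsync(_initialMessage);
}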
 

Start

Because the client explicitly asks for the sayHello method, it will receive the "hello stevelee" returned by the server.

The simplest RPC remote call is done (strictly speaking, the previous demo already counted as RPC; the difference here is that the method to invoke is named in the request and selected by filtering).

 

Questions

  1. Surely a server cannot dispatch methods with this clumsy string filtering? Right — this is only a demo to illustrate the basic concepts; a real implementation would use a dynamic proxy to drive the method Invoke (see the sketch after this list).
  2. This demo is only a point-to-point remote call and does not involve any advanced features such as service routing and forwarding.
  3. Whenever a new interface is added, the code must be recompiled and exposed again. With tens of thousands of interfaces, repeating that work by hand would be madness.
  4. ...and so on.
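
On point 1, here is a minimal sketch of what a dynamic-proxy dispatch could look like, using .NET's DispatchProxy. This is not the demo's or the framework's code; ISayService and RpcProxy are hypothetical names, and the real network round trip is only stubbed out in Invoke.

using System;
using System.Reflection;

public interface ISayService
{
    string SayHello(string content); // the remote method the client wants to call
}

public class RpcProxy<T> : DispatchProxy
{
    // Every call on the generated proxy lands here; a real framework would serialize
    // the method name and arguments, send them over DotNetty, then await the reply.
    protected override object Invoke(MethodInfo targetMethod, object[] args)
    {
        Console.WriteLine($"remote call: {targetMethod.Name}({string.Join(", ", args)})");
        return $"hello {args[0]}"; // placeholder for the real network round trip
    }

    public static T Create() => DispatchProxy.Create<T, RpcProxy<T>>();
}

// Usage: var say = RpcProxy<ISayService>.Create(); Console.WriteLine(say.SayHello("stevelee"));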
Here I recommend the framework I recently built, Easy.Rpc (link: click me), which implements routing, forwarding, proxying and dynamic compilation. I also want to recommend another DotNetty-based RPC framework from a friend (link: click me); team leader Zhang invited me to join them, but I still have not worked out how to join the team — sorry, I will hurry up.
 
This article does not explain in detail how the framework is implemented — that would probably take hundreds of thousands of words and is better split out into a separate series on the principles behind the framework design and the issues to consider, including design patterns, dependency injection, dynamic proxies, dynamic compilation, and routing and forwarding.
 

Easy.Rpc

As mentioned above, solving these problems means changing quite a few things.
 
For example: turn the functions into interfaces; keep the interface definitions on the server side, which opens the corresponding ports to the outside world; keep the interface implementations on the server side as well, serving the calls; and let clients call the remote interfaces in an API-like way — which means the interface definitions have to live as a standalone item;
How are interfaces deployed (exposed) automatically? Through an intermediate coordinator (also known as a service registry, such as etcd, Consul or ZooKeeper). Registering these interfaces with the registry automatically requires scanning them with reflection and adding them to the registry, as sketched below.
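
A minimal sketch of that scan-and-register idea. The RpcTagBundleAttribute class below is a stand-in mirroring the [RpcTagBundle] marker used later in this article, and IServiceRegistry / Register are hypothetical placeholders for a real registry client (ZooKeeper, Consul, etcd and so on).

using System;
using System.Linq;
using System.Reflection;

// Stand-in for the framework's marker attribute ([RpcTagBundle] in the interface below).
[AttributeUsage(AttributeTargets.Interface)]
public class RpcTagBundleAttribute : Attribute { }

// Hypothetical registry client; a real one would talk to ZooKeeper, Consul, etcd, ...
public interface IServiceRegistry
{
    void Register(string serviceName, string address);
}

public static class ServiceScanner
{
    // Scan an assembly for marked service interfaces and push them into the registry.
    public static void RegisterAll(Assembly assembly, IServiceRegistry registry, string address)
    {
        var serviceInterfaces = assembly.GetTypes()
            .Where(t => t.IsInterface && t.GetCustomAttribute<RpcTagBundleAttribute>() != null);

        foreach (var service in serviceInterfaces)
        {
            registry.Register(service.FullName, address);
        }
    }
}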
 
We add an intermediate common library, Rpc.Common (the Easy.Rpc framework source also lives there for now; the framework itself is not discussed in this article), containing the IUserService interface, the UserModel entity class, and the UserServiceImpl implementation class. Strictly speaking, the shared library only needs the interfaces and entities; the implementations belong entirely on the server side so that the library stays fully decoupled. (The author has been lazy and put everything into Rpc.Common here — real production code must never be this sloppy; separate, separate and separate again, which is one of the core ideas of microservices.)
 
The demo is structured as follows (the Easy.Rpc source code is currently included in it; it will be split out in a couple of days into a standalone framework that is easier to consume):

 

 

First look at what the interface defines:

/// <summary>
/// Interface UserService Definition
/// </summary>
[RpcTagBundle]
public interface IUserService
{
    Task<string> GetUserName(int id);

    Task<int> GetUserId(string userName);

    Task<DateTime> GetUserLastSignInTime(int id);

    Task<UserModel> GetUser(int id);

    Task<bool> Update(int id, UserModel model);

    Task<IDictionary<string, string>> GetDictionary();

    Task Try();

    Task TryThrowException();
}
Eight methods, covering almost all the method shapes currently needed for RPC call testing. The implementation is not pasted here; you can implement the interface however you like, even with a bare Console.Write — one possible sketch follows.
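
Since the article does not paste the real UserServiceImpl, here is one possible implementation. The method bodies are illustrative only (they line up with the calls made by the client further below, but the actual implementation may differ).

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public class UserServiceImpl : IUserService
{
    public Task<string> GetUserName(int id) => Task.FromResult($"user-{id}");

    public Task<int> GetUserId(string userName) => Task.FromResult(userName.GetHashCode());

    public Task<DateTime> GetUserLastSignInTime(int id) => Task.FromResult(DateTime.Now);

    public Task<UserModel> GetUser(int id) => Task.FromResult(new UserModel { Name = "rabbit", Age = 18 });

    public Task<bool> Update(int id, UserModel model) => Task.FromResult(true);

    public Task<IDictionary<string, string>> GetDictionary() =>
        Task.FromResult<IDictionary<string, string>>(new Dictionary<string, string> { { "key", "value" } });

    public Task Try()
    {
        Console.WriteLine("try called");
        return Task.CompletedTask;
    }

    public Task TryThrowException() => Task.FromException(new Exception("test exception"));
}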
The interface parameters use a UserModel entity object, which is also pasted here:
[ProtoContract]
public class UserModel
{
    [ProtoMember(1)] public string Name { get; set; }

    [ProtoMember(2)] public int Age { get; set; }
}

 

The two attributes above are specific to protobuf-net.

ProtoContract: marks the class as a data contract that takes part in serialization.
ProtoMember: marks a property that should be serialized and gives its order (tag).
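
A minimal sketch of what these two attributes give you — a protobuf-net round trip of UserModel (assuming the protobuf-net package is referenced):

using System.IO;
using ProtoBuf; // protobuf-net

public static class ProtoRoundTrip
{
    // Serialize and immediately deserialize a UserModel; the tags in [ProtoMember] decide the wire order.
    public static UserModel Clone(UserModel source)
    {
        using (var stream = new MemoryStream())
        {
            Serializer.Serialize(stream, source);
            stream.Position = 0;
            return Serializer.Deserialize<UserModel>(stream);
        }
    }
}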

 

protobuf-net pitfalls

  1. The default example has no inheritance, so no nasty problems arise. But if UserModel were a subclass, inheriting from a parent that itself has several subclasses, marking it with [ProtoContract] alone would cause a serialization error; DataMemberOffset = X also needs to be added to the attribute, where X is not a letter but the serialization order of this subclass. For example, if three subclasses inherit the same parent and the first two use offsets 1 and 2, this one gets 3, and so on. (A commonly used alternative is sketched after this list.)
  2. Standard system-defined types are fine, but an array type such as int[] is a nightmare. The protobuf-net team does not explain why array serialization is not supported here; my guess is that it was dropped because arrays can be irregular (multidimensional, even jagged multidimensional arrays) — after all, serializing an ordered list should not hurt performance.
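
For reference on point 1: the approach most commonly documented for inheritance in protobuf-net is to declare each subtype on the base class with [ProtoInclude], giving every subtype its own unique tag. This is offered as an alternative sketch (the type names are hypothetical), not the DataMemberOffset workaround described above.

using ProtoBuf;

[ProtoContract]
[ProtoInclude(10, typeof(AdminModel))]   // each known subtype gets a unique tag on the base class
[ProtoInclude(11, typeof(GuestModel))]
public class PersonModel
{
    [ProtoMember(1)] public string Name { get; set; }
}

[ProtoContract]
public class AdminModel : PersonModel
{
    [ProtoMember(1)] public int Level { get; set; }
}

[ProtoContract]
public class GuestModel : PersonModel
{
    [ProtoMember(1)] public string Source { get; set; }
}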

Next, continue with the server-side code

static void Main()
{
    var bTime = DateTime.Now;

    // Automatic assembly
    var serviceCollection = new ServiceCollection();
    {
        serviceCollection
            .AddLogging()
            .AddRpcCore()
            .AddService()
            .UseSharedFileRouteManager("d:\\routes.txt")
            .UseDotNettyTransport();

        // Inject the local test implementation
        serviceCollection.AddSingleton<IUserService, UserServiceImpl>();
    }

    // Build the current container
    var buildServiceProvider = serviceCollection.BuildServiceProvider();

    // Get the service entry manager
    var serviceEntryManager = buildServiceProvider.GetRequiredService<IServiceEntryManager>();
    var addressDescriptors = serviceEntryManager.GetEntries().Select(i => new ServiceRoute
    {
        Address = new[]
        {
            new IpAddressModel {Ip = "127.0.0.1", Port = 9881}
        },
        ServiceDescriptor = i.Descriptor
    });
    var serviceRouteManager = buildServiceProvider.GetRequiredService<IServiceRouteManager>();
    serviceRouteManager.SetRoutesAsync(addressDescriptors).Wait();

    // Wire up internal log handling
    buildServiceProvider.GetRequiredService<ILoggerFactory>().AddConsole((console, logLevel) => (int) logLevel >= 0);

    // Get the service host
    var serviceHost = buildServiceProvider.GetRequiredService<IServiceHost>();

    Task.Factory.StartNew(async () =>
    {
        // Start the host
        await serviceHost.StartAsync(new IPEndPoint(IPAddress.Parse("127.0.0.1"), 9881));
    });

    Console.ReadLine();
}
The whole process is driven by the ServiceCollection for automatic assembly and construction; I trust you can see the value of dependency injection and of building services automatically through the IoC container here.
Next, the client code:
static void Main()
{
    var serviceCollection = new ServiceCollection();
    {
        serviceCollection
            .AddLogging()                                // Add logging
            .AddClient()                                 // Add the client
            .UseSharedFileRouteManager(@"d:\routes.txt") // Add the shared file route manager
            .UseDotNettyTransport();                     // Add the DotNetty transport
    }

    var serviceProvider = serviceCollection.BuildServiceProvider();

    serviceProvider.GetRequiredService<ILoggerFactory>().AddConsole((console, logLevel) => (int) logLevel >= 0);

    var services = serviceProvider.GetRequiredService<IServiceProxyGenerater>()
        .GenerateProxys(new[] {typeof(IUserService)}).ToArray();

    var userService = serviceProvider.GetRequiredService<IServiceProxyFactory>().CreateProxy<IUserService>(
        services.Single(typeof(IUserService).GetTypeInfo().IsAssignableFrom)
    );

    while (true)
    {
        Task.Run(async () =>
        {
            Console.WriteLine($"userService.GetUserName:{await userService.GetUserName(1)}");
            Console.WriteLine($"userService.GetUserId:{await userService.GetUserId("rabbit")}");
            Console.WriteLine($"userService.GetUserLastSignInTime:{await userService.GetUserLastSignInTime(1)}");
            var user = await userService.GetUser(1);
            Console.WriteLine($"userService.GetUser:name={user.Name},age={user.Age}");
            Console.WriteLine($"userService.Update:{await userService.Update(1, user)}");
            Console.WriteLine($"userService.GetDictionary:{(await userService.GetDictionary())["key"]}");
            await userService.Try();
            Console.WriteLine("client function completed!");
        }).Wait();
        Console.ReadKey();
    }
}

 

If you have read this far and understood what the code above does, you understand what this framework is for: a client calls remote methods exactly as if they were local methods, with everything in between completely transparent — decoupled, decoupled, decoupled.
The benefits of microservices themselves need no further description here.
 
 
Thanks for reading!

Posted by TeddyKiller on Tue, 21 Apr 2020 12:23:54 -0700