METATRADER 5 — EXAMPLES
DMITRIY GIZLYK
Introduction
Cloud technologies are becoming more popular in the modern world. We are able to use paid or free storage services of various sizes. But is it possible to use them in practical trading? This
article proposes a technology for exchanging data between terminals using cloud storage services.
You may ask why we need cloud storage for this when solutions for a direct connection between terminals already exist. I believe this approach has a number of advantages. First, the provider remains anonymous: users access a cloud server instead of the provider's PC. The provider's computer is thus protected from virus attacks, and it does not have to be permanently connected to the internet; it only needs to connect to send messages to the server. Second, a cloud can host a virtually unlimited number of providers. And third, as the number of users grows, providers do not have to upgrade their computing capacity.
Let's use the free cloud storage of 15 GB provided by Google as an example. This is more than enough for our objectives.
1. A bit of theory
Authorization in Google Drive is arranged via the OAuth 2.0 protocol. This is an open authorization protocol that allows third-party applications and websites to have limited access to protected
resources of authorized users without the need to pass credentials. The basic OAuth 2.0 access scenario consists of 4 stages.
1. First, you need to obtain the authorization data (client ID and client secret). These data are generated by the website and are therefore known to both the site and the application.
2. Before the application can access the personal data, it should receive an access token. One such token can provide different access levels defined by the 'scope' variable. When the access
token is requested, the application can send one or more values in the 'scope' parameter. To create this request, the application can use the system browser and web service requests.
Some requests require an authentication step, at which users log in with their account. After logging in, users are asked if they are ready to grant permissions requested by the
application. The process is called the user consent. If the user grants consent, the authorization server provides the application the authorization code allowing the application to get the
access token. If the user does not grant permission, the server returns an error.
3. After the application receives the access token, it sends it in the HTTP Authorization header. Access tokens are valid only for the set of operations and resources described in the request's 'scope' parameter. For example, if an access token has been issued for Google Drive, it does not grant access to Google Contacts. However, the application can send this access token to Google Drive repeatedly to perform the allowed operations.
4. Tokens have a limited lifespan. If an application needs access after the access token has expired, it can use a refresh token that allows it to obtain new access tokens.
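As a language-neutral illustration of stage 4, here is a small Python sketch of exchanging a refresh token for a new access token. The endpoint and field names follow Google's OAuth 2.0 documentation; the client ID, secret and token values are placeholders, and this is not part of the article's C# bridge.

```python
# Sketch of the refresh-token grant (stage 4 of the OAuth 2.0 flow).
# Field names and the token endpoint follow Google's OAuth 2.0 docs;
# the credential values passed in are placeholders.
import json
from urllib import parse, request

TOKEN_URI = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    """Build the URL-encoded POST body for a refresh-token grant."""
    return parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }).encode()

def refresh_access_token(client_id, client_secret, refresh_token):
    """POST the grant and return the fresh access token."""
    body = build_refresh_request(client_id, client_secret, refresh_token)
    with request.urlopen(request.Request(TOKEN_URI, data=body)) as resp:
        return json.load(resp)["access_token"]
```

In the C# bridge below this exchange is handled transparently by the Google client library, so the sketch only shows what happens under the hood.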
Create a new project for the application. Go to the project panel ("Select a project" button or Ctrl + O). Create a new project (+).
In the newly opened page, set the project name, agree with the terms of use and confirm creation.
https://www.mql5.com/en/articles/3331 1/11
8/2/2020 Using cloud storage services for data exchange between terminals - MQL5 Articles
Select the new project from the panel and connect Google Drive API to it. To do this, select Drive API in the manager's API library and activate the API on a new page by clicking Enable.
The new page prompts us to create credentials to use the API. Click "Create credentials" to do that.
The Google console offers a wizard for selecting the authentication type, but we do not need it. Click "client ID". Next, Google again warns us of the need to configure the access confirmation page. Click "Configure consent screen" to do this.
In the newly opened page, leave all fields at their defaults, filling in only "Product name shown to users". Next, set the application type to "Other", specify the client name and click "Create". The service generates the "client ID" and "client secret" codes. You can copy them, but this is not necessary: click "Ok" and download the JSON file with the access data to your local disk.
After that, our preparatory work on the service side is complete and we can start developing our applications.
As I said earlier, the bridge application is developed in C# using Google libraries. Let's create a Windows Forms project in Visual Studio and add the Google.Apis.Drive.v3 library to it using NuGet.
Next, let's create the GoogleDriveClass class to work with Google Drive:
class GoogleDriveClass
{
static string[] Scopes = { DriveService.Scope.DriveFile }; // Access scopes requested from Google Drive
static string ApplicationName = "Google Drive Bridge"; // Application name
public static UserCredential credential = null; // Authorization keys
public static string extension = ".gdb"; // Extension for saved files
}
First, let's create the function for logging in to the service. It uses the previously saved JSON file with the access codes. In my case, it is "client-secret.json". If you saved the file under a different name, specify that name in the function code. After loading the login data, the asynchronous authorization function of the service is called. On a successful login, the token is saved in the credential object for later access. When working in C#, do not forget about exception handling: in case of an exception, the credential object is reset.
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
GoogleDriveClass.Scopes,
"example.bridge@gmail.com",
CancellationToken.None,
new FileDataStore(credPath, true)).Result;
}
catch (Exception)
{
credential = null;
}
}
return (credential != null);
}
When working with Google Drive, our "bridge" should perform two functions: writing data to the disk and reading the required file from it. Let us consider them in more detail. To implement these seemingly simple functions, we need to write a number of procedures. The reason is that the Google Drive file system differs from the one we are accustomed to. Here, file names and extensions exist as separate entries only to maintain the customary presentation. In fact, when saving, each file is assigned a unique ID under which it is stored. Thus, users can save an unlimited number of files with the same name and extension. Therefore, before accessing a file, we need to know its ID in the cloud storage. To do this, we load the list of all files on the disk and compare their names one by one with the specified one.
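The name-to-ID resolution can be sketched in Python (an illustration, not the article's C# code); the listing entries mimic the name/id pairs returned by the Drive API:

```python
# Resolve a file name to its Drive ID by scanning the file listing.
# Drive allows duplicate names, so a name may match several distinct IDs;
# this sketch simply returns the first match.
def get_file_id(files, name):
    """Return the ID of the first file with the given name, or None."""
    for f in files:
        if f["name"] == name:
            return f["id"]
    return None

# Sample listing: two files share a name but have distinct IDs.
listing = [
    {"id": "a1", "name": "EURUSD.gdb"},
    {"id": "b2", "name": "GBPUSD.gdb"},
    {"id": "c3", "name": "EURUSD.gdb"},  # duplicate name, its own ID
]
```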
The GetFileList function is responsible for obtaining the file list. It returns the Google.Apis.Drive.v3.Data.File class list. Let's use the Google.Apis.Drive.v3.DriveService class from the
previously downloaded libraries to receive the file list from Google Drive. When initializing the class, we pass to it the token obtained when logging in together with our project name. The
resulting list is stored in the returned result variable. In case of exceptions, the variable is reset to zero. The file list is requested and processed as the necessity arises in other functions of our
application.
// List files.
result = listRequest.Execute().Files;
}
catch (Exception)
{
return null;
}
}
return result;
}
{
var body = new File();
body.Name = name;
body.MimeType = "text/json";
body.ViewersCanCopyContent = true;
The next step after creating the file is the update function. Let's recall the objectives of our application and the features of the Google Drive file system. We are developing an application for exchanging data between several terminals located on different PCs. We do not know at what time and by how many terminals our information will be required. However, the cloud file system allows several files with the same name and extension to exist. This enables us to first create a new file with the fresh data and delete the obsolete files from the cloud storage afterwards. This is what the FileUpdate function does. Its input parameters are the file name and its contents; it returns a logical value with the operation result.
At the beginning of the function, we declare the new_id text variable and call the previously created FileCreate function, which creates a new data file in the cloud and returns the new file's ID in our variable.
Then we get the list of all files in the cloud via the GetFileList function and compare each entry with the name and ID of the newly created file. All unnecessary duplicates are removed from the storage. Here we again use the familiar Google.Apis.Drive.v3.DriveService class, while delete requests are sent using the Google.Apis.Drive.v3.FilesResource.DeleteRequest class.
string new_id;
if (FileCreate(name, value, out new_id))
{
IList<File> files = GetFileList();
if (files != null && files.Count > 0)
{
result = true;
try
{
using (var service = new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
}))
{
foreach (var file in files)
{
if (file.Name == name && file.Id != new_id)
{
try
{
Google.Apis.Drive.v3.FilesResource.DeleteRequest request = service.Files.Delete(file.Id);
string res = request.Execute();
}
catch (Exception)
{
continue;
}
}
}
}
}
catch (Exception)
{
return result;
}
}
}
return result;
}
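The core of this update strategy, create the fresh file first and only then sweep away stale copies sharing its name, can be condensed into a few lines of Python (an illustration with a hypothetical stale_ids helper, not the bridge's actual code):

```python
# Given the full listing, pick out obsolete duplicates of `name`:
# every file with the same name except the freshly created `new_id`.
def stale_ids(files, name, new_id):
    return [f["id"] for f in files if f["name"] == name and f["id"] != new_id]

listing = [
    {"id": "a1", "name": "EURUSD.gdb"},  # obsolete copy
    {"id": "b2", "name": "EURUSD.gdb"},  # the file just created
    {"id": "c3", "name": "GBPUSD.gdb"},  # different name, untouched
]
```

Deleting the returned IDs one by one mirrors the loop over DeleteRequest in FileUpdate; creating before deleting means a reader never observes a moment with no file under the shared name.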
After obtaining the file ID, we are able to retrieve from it the data we need. To do this, we need the FileRead function, to which the necessary file ID is passed, while the function returns its
contents. If unsuccessful, the function returns an empty string. As before, we need the Google.Apis.Drive.v3.DriveService class to create a connection and the
Google.Apis.Drive.v3.FilesResource.GetRequest class to create a request.
if (result)
{
int start = 0;
int count = (int)stream.Length;
value = Encoding.Default.GetString(stream.GetBuffer(), start, count);
}
}
}
}
return value;
}
Create the PipesCreate function for launching the operational threads. In this function, we initialize the array of threads and launch them in a loop. Each launched thread runs the ServerThread function, which initializes the work of that thread.
for (i = 0; i < numThreads; i++)
{
servers[i] = new Thread(ServerThread);
servers[i].Start();
}
}
Also, at the start of each thread, a named pipe is created and the asynchronous wait for a client connection to the pipe is launched. When a client connects to the pipe, the Connected function is called; for this, we create the AsyncCallback asyn_connected delegate. If an exception occurs, the thread is restarted.
When a client connects to the named pipe, we check the state of the pipe and, in case of an exception, restart the thread. If the connection is stable, we start reading the request from the client application. If the reading function returns false, we restart the connection.
while (pipeServer.IsConnected)
{
if (!ReadMessage(pipeServer))
{
exit = true;
break;
}
}
//Wait for a client to connect
AsyncCallback asyn_connected = new AsyncCallback(Connected);
pipeServer.Disconnect();
pipeServer.BeginWaitForConnection(asyn_connected, pipeServer);
break;
}
}
finally
{
exit = true;
}
}
The ReadMessage function reads and processes requests from client applications. A reference to the named pipe object is passed to the function as a parameter. The result of the function is the logical value of the operation. First, the function reads the application's request from the named pipe and splits it into fields. Then it recognizes the command and performs the necessary actions.
The function handles three commands:
To close the current connection, the function should simply return false. The Connected function that called it does all the rest.
To execute the file read request, we should define the file ID and read its contents using the GetFileID and FileRead functions described above.
After executing the file write function, call the previously created FileUpdate function.
Of course, do not forget about exception handling. In case of an exception, log in to Google again.
if (message.Trim() == "Close\0")
return false;
if (arr_message[0].Trim() == "Write")
{
try
{
result = (Drive.FileUpdate(arr_message[1].Trim() + GoogleDriveClass.extension, arr_message[2].Trim()) ? "Ok" : "Error");
}
catch (Exception e)
{
result = "Error " + e.ToString();
Drive.Authorize();
}
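The request framing the bridge parses can be sketched in Python. The semicolon-separated "Command;Name;Payload" layout is taken from the article's requests; the helper itself is illustrative:

```python
# Split a pipe request into (command, file name, payload).
# "Close" arrives as a bare command (with a trailing NUL from the
# terminal side); "Read" and "Write" carry a name and optional payload.
def parse_request(message):
    parts = message.rstrip("\0").split(";", 2)
    cmd = parts[0].strip()
    if cmd == "Close":
        return ("Close", None, None)
    name = parts[1].strip() if len(parts) > 1 else ""
    payload = parts[2] if len(parts) > 2 else ""
    return (cmd, name, payload)
```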
After processing a request, we should return the result of the operation to the client application. Let's create the WriteMessage function. Its parameters are a reference to the current named pipe object and the message sent to the application. The function returns a logical value with the operation result.
Now that we have described all the necessary functions, it is time to run the PipesCreate function. I created the Windows Form project, so I run this function from the Form1 function.
public Form1()
{
InitializeComponent();
PipesCreate();
}
All we have to do now is re-compile the project and copy the json file with the cloud storage access data to the application folder.
enum ENUM_SET_TYPE
{
ENUM_SET_TYPE_INTEGER=0,
ENUM_SET_TYPE_DOUBLE=1,
ENUM_SET_TYPE_STRING=2
};
Create the CCopyObject class for processing the object data. A string parameter is passed to it during initialization; it later identifies the objects created by our class on the chart. We save this value in the s_ObjectsID class variable.
class CCopyObject
{
private:
string s_ObjectsID;
public:
CCopyObject(string objectsID="CopyObjects");
~CCopyObject();
};
//+------------------------------------------------------------------+
//| |
//+------------------------------------------------------------------+
CCopyObject::CCopyObject(string objectsID="CopyObjects")
{
s_ObjectsID = (objectsID==NULL || objectsID=="" ? "CopyObjects" : objectsID);
}
For example, the HLineToString function is called to describe a horizontal line. Chart ID and object name are used as its parameters. The function is to return the structured string with the
object parameter. For example, for a horizontal line, the passed parameters are price, color, line width and whether the line is displayed in front of the chart or in the background. Do not
forget to set the parameter type from the previously created enumeration before the parameter property.
result+=IntegerToString(ENUM_SET_TYPE_DOUBLE)+"="+IntegerToString(OBJPROP_PRICE)+"=0="+DoubleToString(ObjectGetDouble(chart,name,OBJPROP_PRICE,0))+"|";
result+=IntegerToString(ENUM_SET_TYPE_INTEGER)+"="+IntegerToString(OBJPROP_COLOR)+"=0="+IntegerToString(ObjectGetInteger(chart,name,OBJPROP_COLOR,0))+"|";
result+=IntegerToString(ENUM_SET_TYPE_INTEGER)+"="+IntegerToString(OBJPROP_STYLE)+"=0="+IntegerToString(ObjectGetInteger(chart,name,OBJPROP_STYLE,0))+"|";
result+=IntegerToString(ENUM_SET_TYPE_INTEGER)+"="+IntegerToString(OBJPROP_BACK)+"=0="+IntegerToString(ObjectGetInteger(chart,name,OBJPROP_BACK,0))+"|";
result+=IntegerToString(ENUM_SET_TYPE_INTEGER)+"="+IntegerToString(OBJPROP_WIDTH)+"=0="+IntegerToString(ObjectGetInteger(chart,name,OBJPROP_WIDTH,0))+"|";
result+=IntegerToString(ENUM_SET_TYPE_STRING)+"="+IntegerToString(OBJPROP_TEXT)+"=0="+ObjectGetString(chart,name,OBJPROP_TEXT,0)+"|";
result+=IntegerToString(ENUM_SET_TYPE_STRING)+"="+IntegerToString(OBJPROP_TOOLTIP)+"=0="+ObjectGetString(chart,name,OBJPROP_TOOLTIP,0);
return result;
}
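The serialized layout produced above, entries of the form "set_type=property=modifier=value" joined by "|", can be mimicked in a short Python sketch (the property names here are readable placeholders standing in for the numeric OBJPROP_* codes):

```python
# Build a property string in the article's "type=prop=modifier=value|..."
# layout. Set-type codes match ENUM_SET_TYPE; property names are
# placeholders for the numeric OBJPROP_* identifiers used in MQL5.
SET_INTEGER, SET_DOUBLE, SET_STRING = 0, 1, 2

def encode_property(set_type, prop, modifier, value):
    return "{}={}={}={}".format(set_type, prop, modifier, value)

def encode_hline(price, color, width):
    entries = [
        encode_property(SET_DOUBLE, "PRICE", 0, price),
        encode_property(SET_INTEGER, "COLOR", 0, color),
        encode_property(SET_INTEGER, "WIDTH", 0, width),
    ]
    return "|".join(entries)
```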
Similarly, create the functions for describing other object types. In my case, these are VLineToString for a vertical line, TrendToString for a trend line and RectangleToString for a rectangle.
The codes of these functions can be found in the attached class code.
4.1.2. Function for plotting objects on a chart
We have created the function for data collection. Now, let's develop the function that reads the messages and plots objects on the chart: DrawObjects. Its parameters are the chart ID and a
received message. The function returns the logical value of the operation execution.
The function of assigning properties received in a message to a chart object is universal and applicable to any object types. The chart ID, the object name and a string array of parameters are
passed to it as parameters. Each array element is divided into an operation type, a property, a modifier and a value. The obtained values are assigned to the object through a function
corresponding to the type of operation.
for(int i=0;i<total_settings;i++)
{
string setting[];
int total=StringSplit(settings[i],'=',setting);
if(total<3)
continue;
switch((ENUM_SET_TYPE)StringToInteger(setting[0]))
{
case ENUM_SET_TYPE_INTEGER:
ObjectSetInteger(chart,name,(ENUM_OBJECT_PROPERTY_INTEGER)StringToInteger(setting[1]),(int)(total==3 ? 0 : StringToInteger(setting[2])),StringToInteger(setting[total-1]));
break;
case ENUM_SET_TYPE_DOUBLE:
ObjectSetDouble(chart,name,(ENUM_OBJECT_PROPERTY_DOUBLE)StringToInteger(setting[1]),(int)(total==3 ? 0 : StringToInteger(setting[2])),StringToDouble(setting[total-1]));
break;
case ENUM_SET_TYPE_STRING:
ObjectSetString(chart,name,(ENUM_OBJECT_PROPERTY_STRING)StringToInteger(setting[1]),(int)(total==3 ? 0 : StringToInteger(setting[2])),setting[total-1]);
break;
}
}
return true;
}
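The receiving side of the same format can be sketched as follows; like DrawObjects, it splits each entry on '=' and treats a three-field entry as having an implicit zero modifier (again an illustration, not the MQL5 code itself):

```python
# Decode one settings entry into (set_type, property, modifier, value).
# Entries with fewer than three fields are skipped, as in DrawObjects.
def parse_setting(entry):
    parts = entry.split("=")
    if len(parts) < 3:
        return None
    set_type = int(parts[0])
    prop = parts[1]
    if len(parts) == 3:
        # Modifier omitted: defaults to 0, third field is the value.
        modifier, value = 0, parts[2]
    else:
        modifier, value = int(parts[2]), parts[3]
    return (set_type, prop, modifier, value)
```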
After plotting the objects on the chart, we need to compare the objects present on the chart with those passed in the message. "Unnecessary" objects that carry our ID but are absent from the message are removed from the chart (these are the objects deleted by the provider). The DeleteExtraObjects function is responsible for this. Its parameters are the chart ID and an array of structures containing the object names and types.
{
found=true;
break;
}
}
if(!found)
{
if(ObjectDelete(chart,name))
i--;
}
}
return;
}
Include the necessary libraries in the application header. These are the class for working with graphical objects described above and the base class for working with named pipes. Also, specify the name of the pipe the application connects to.
#include <CopyObject.mqh>
#include <Files\FilePipe.mqh>
In the global variables, declare the class for working with graphical objects, a string variable for storing the last sent message and a uchar array to which the command for closing the connection to the cloud storage is written.
CCopyObject *CopyObjects;
string PrevMessage;
uchar Close[];
In the OnInit function, initialize global variables and launch the function for sending data to the cloud storage if necessary.
int OnInit()
{
//---
CopyObjects = new CCopyObject();
PrevMessage="Init";
StringToCharArray(("Close"),Close,0,WHOLE_ARRAY,CP_UTF8);
if(SendAtStart)
SendMessage(ChartID());
//---
return(INIT_SUCCEEDED);
}
In the OnDeinit function, delete the object class for working with graphical objects.
The function for sending info messages to the cloud storage is called from the OnChartEvent function when an object is created, modified or removed from the chart.
The main operations are performed in the SendMessage function, which receives the chart ID as an input. Its algorithm can be divided into several stages:
- checking the status of the class for working with graphical objects and re-initializing it if necessary;
- generating the message to be sent to the cloud using the previously created CreatMessage function, and exiting if the message is empty or equal to the last sent one;
- creating the name of the file to be sent to the cloud based on the chart symbol;
- establishing a connection to our bridge application via the named pipe;
- passing an order over the open connection to send the message to the cloud storage under the specified file name;
- sending an order to close the connection to the cloud and breaking the named pipe to the bridge application after receiving a response about the send order execution;
- deleting the named pipe objects before exiting.
During the execution of these operations, information messages are displayed in the chart comment.
string Name=SymbolInfoString(ChartSymbol(chart),SYMBOL_CURRENCY_BASE)+SymbolInfoString(ChartSymbol(chart),SYMBOL_CURRENCY_PROFIT);
CFilePipe *pipe=new CFilePipe();
int handle=pipe.Open(Connection,FILE_WRITE|FILE_READ);
if(handle<=0)
{
Comment("Pipe not found");
delete pipe;
return false;
}
uchar iBuffer[];
int size=StringToCharArray(("Write;"+Name+";"+message),iBuffer,0,WHOLE_ARRAY,CP_UTF8);
if(pipe.WriteArray(iBuffer)<=0)
{
Comment("Error of sending request");
pipe.Close();
delete pipe;
return false;
}
ArrayFree(iBuffer);
uint res=0;
do
{
res=pipe.ReadArray(iBuffer);
}
while(res==0 && !IsStopped());
if(res>0)
{
string result=CharArrayToString(iBuffer,0,WHOLE_ARRAY,CP_UTF8);
if(result!="Ok")
{
Comment(result);
pipe.WriteArray(Close);
pipe.Close();
delete pipe;
return false;
}
}
PrevMessage=message;
pipe.WriteArray(Close);
pipe.Close();
delete pipe;
Comment("");
return true;
}
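The round trip performed by SendMessage, write the request, wait for the bridge's reply, then ask it to close the cloud connection, can be reduced to a transport-agnostic Python sketch (the pipe object here is a stand-in for the MQL5 CFilePipe):

```python
# Frame a write request and drive one request/reply/close exchange.
# `pipe` is any object with write(bytes) and read() -> bytes; on Windows
# it would wrap the named pipe connecting to the bridge application.
def build_write_request(name, message):
    return ("Write;" + name + ";" + message).encode("utf-8")

def exchange(pipe, name, message):
    pipe.write(build_write_request(name, message))
    reply = pipe.read().decode("utf-8")
    pipe.write(b"Close\0")          # tell the bridge to drop the session
    return reply == "Ok"
```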
#include <CopyObject.mqh>
#include <Files\FilePipe.mqh>
The application features three external parameters: the period in seconds between cloud storage data updates, the ID of the objects on the chart, and a logical value indicating whether all created objects should be removed from the chart when the application is closed.
In the global variables (just like in the provider application), declare the class for working with graphical objects, a string variable for storing the last received message and a uchar array to which the command for closing the connection to the cloud storage is written. In addition, add a boolean variable for the timer state and variables to store the time of the last update and of the last comment displayed on the chart.
CCopyObject *CopyObjects;
string PrevMessage;
bool timer;
datetime LastRefresh,CommentStart;
uchar Close[];
int OnInit()
{
//---
CopyObjects = new CCopyObject(ObjectsID);
PrevMessage="Init";
timer=EventSetTimer(1);
if(!timer)
{
Comment("Error of set timer");
CommentStart=TimeCurrent();
}
LastRefresh=0;
StringToCharArray(("Close"),Close,0,WHOLE_ARRAY,CP_UTF8);
//---
return(INIT_SUCCEEDED);
}
In the OnDeinit deinitialization function, delete the object class for working with graphical objects, stop the timer, clear the comments and (if necessary) remove the objects created by the
application from the chart.
In the OnTick function, check the timer status and re-activate it if needed.
void OnTick()
{
//---
if(!timer)
{
timer=EventSetTimer(1);
if(!timer)
{
Comment("Error of set timer");
CommentStart=TimeCurrent();
}
OnTimer();
}
}
In the OnTimer function, clear any comment that has been displayed on the chart for longer than 10 seconds and call the function that reads the data file from the cloud storage (ReadMessage). After the data has been loaded successfully, the time of the last data update is changed.
void OnTimer()
{
//---
if((TimeCurrent()-CommentStart)>10)
{
Comment("");
}
if((TimeCurrent()-LastRefresh)>=RefreshTime)
{
if(ReadMessage(ChartID()))
{
LastRefresh=TimeCurrent();
}
}
}
The basic actions of loading data from the cloud storage and plotting objects on the chart are performed in the ReadMessage function. The function has a single parameter: the ID of the chart it works with. The operations performed in the function can be divided into several stages:
- generating the name of the file to read from the cloud based on the chart symbol;
- opening a named pipe to connect to the bridge application;
- sending a request to read data from the cloud storage, specifying the required file;
- reading the result of the request processing;
- sending an order to close the connection to the cloud and breaking the named pipe to the bridge application;
- comparing the obtained result with the previous message and exiting the function if the data are identical;
- passing the obtained message to the DrawObjects function of the graphical elements processing class object;
- saving the successfully processed message in the PrevMessage variable for subsequent comparison with newly obtained data.
int handle=pipe.Open(Connection,FILE_WRITE|FILE_READ);
if(handle<=0)
{
Comment("Pipe not found");
CommentStart=TimeCurrent();
delete pipe;
return false;
}
Comment("Send request");
uchar iBuffer[];
int size=StringToCharArray(("Read;"+Name+";"),iBuffer,0,WHOLE_ARRAY,CP_UTF8);
if(pipe.WriteArray(iBuffer)<=0)
{
pipe.Close();
delete pipe;
return false;
}
Sleep(10);
ArrayFree(iBuffer);
Comment("Read message");
uint res=0;
do
{
res=pipe.ReadArray(iBuffer);
}
while(res==0 && !IsStopped());
Sleep(10);
Comment("Close connection");
pipe.WriteArray(Close);
pipe.Close();
delete pipe;
Comment("");
string result=NULL;
if(res>0)
{
result=CharArrayToString(iBuffer,0,WHOLE_ARRAY,CP_UTF8);
if(StringFind(result,"Error",0)>=0)
{
Comment(result);
CommentStart=TimeCurrent();
return false;
}
}
else
{
Comment("Empty message");
return false;
}
if(result==PrevMessage)
return true;
if(CheckPointer(CopyObjects)==POINTER_INVALID)
{
CopyObjects = new CCopyObject();
if(CheckPointer(CopyObjects)==POINTER_INVALID)
return false;
}
if(CopyObjects.DrawObjects(chart,result))
{
PrevMessage=result;
}
else
{
return false;
}
return true;
}
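The caching step at the heart of ReadMessage, redraw only when the message actually changed, is worth isolating. A minimal Python sketch (draw stands in for the DrawObjects call):

```python
# Decide what the new "last processed" message should be.
# Unchanged input: skip redrawing. Successful draw: adopt the new
# message as the baseline. Failed draw: keep the old baseline so the
# message is retried on the next timer tick.
def handle_message(message, prev_message, draw):
    if message == prev_message:
        return prev_message
    if draw(message):
        return message
    return prev_message
```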
Enter the email address you provided when registering your Google account and go to the next page (NEXT button). On the next page, enter the password for accessing the account.
On the next page, Google will ask you to confirm the application's access rights to the cloud storage. Review the requested access rights and confirm them (ALLOW button).
The drive-bridge.json subfolder is created in the bridge application directory. It stores the file containing the cloud storage access token. In the future, when deploying the application on other computers, this subdirectory should be copied together with the bridge program. This saves us from repeating the authorization procedure and from transferring the cloud storage access data to third parties.
Conclusion
In this article, we have examined the use of cloud storage for practical purposes. The bridge application is a universal tool for uploading data to the cloud storage and loading it back into our applications. The proposed solution for transmitting graphical objects allows you to share your technical analysis results with colleagues in real time. Perhaps someone will decide to provide trading signals or arrange training courses on chart-based technical analysis this way.
Warning: All rights to these materials are reserved by MetaQuotes Ltd. Copying or reprinting of these materials in whole or in part is prohibited.