The Effect of Mobility on Delayed Data Offloading

Delayed offloading is a widely accepted way for mobile users to offload their traffic through Wi-Fi while they are moving in urban areas. However, delayed offloading enhances offloading efficiency at the expense of delay performance. Previous works mainly focus on improving offloading efficiency while keeping delay performance in an acceptable region. In this paper, we study the impact of user mobility on delayed data offloading with respect to the tradeoff between offloading efficiency and delay performance. We model a mobile terminal with delayed data offloading as an M/MMSP/1 queueing system with three service states. To be practical, our analysis accounts for the fact that current commercial mobile terminals cannot transmit over cellular and Wi-Fi concurrently. Our analytical results show that the mobility of users can reduce the queueing delay incurred by delayed offloading, and suggest that delayed offloading strategies can be optimized according to the mobility of the terminals once the delay requirement is given.


I Introduction

In recent years, urban areas have witnessed a surge in mobile data traffic. A Cisco survey shows that global mobile traffic has grown 17-fold over the past 5 years [1]. The explosion of mobile traffic has led to cellular network overload, which degrades user satisfaction [2]. Even though 5G will bring new spectrum to market, the proliferation of devices and connections will demand additional bandwidth [3]. A widely accepted solution is to offload part of the mobile traffic through the Wi-Fi interface, whose bandwidth is cheaper [4]. This solution is becoming increasingly attractive to both mobile network operators and mobile users.

Nowadays, cellular network coverage is nearly ubiquitous in urban areas. To provide offloading service, many public places in urban areas, such as residential areas, commercial districts, and transportation hubs, are equipped with Wi-Fi hotspots [5]. When mobile users move through these places, they pass through Wi-Fi coverage and cellular-only coverage alternately. After losing the Wi-Fi signal, if users can tolerate a certain delay to wait for the next Wi-Fi access point, they will be able to offload more traffic through Wi-Fi and thus keep their communication costs as low as possible.

Based on this idea, several kinds of delayed offloading strategies have been proposed in [6, 7, 8, 9, 10, 11]. The goal is to improve offloading efficiency while keeping delay performance in an acceptable region. Herein, the offloading efficiency is defined as the ratio of the data offloaded via Wi-Fi to the total transmitted data. In [6], service requests enter a Wi-Fi buffer when the buffer is not full; otherwise, the requests are transmitted through the cellular network. In [7] and [8], each file, e.g., an email or a picture, is assigned a timer when it enters the Wi-Fi buffer. If the file is still waiting in the Wi-Fi queue when the timer reaches a preset deadline, it is sent via the cellular network. Ref. [9] proposed to decide whether a newly arrived packet is directly transmitted via the cellular network according to the Wi-Fi buffer length and the network connection. Clearly, these strategies imply that mobile terminals are able to send traffic through the cellular network and Wi-Fi at the same time. However, such concurrent transmission [12] is not supported by most current commercial mobile terminals [13].

In this paper, we study the data offloading problem of current commercial mobile terminals, which can use only one wireless channel at a time to transmit traffic. Our goal is to find out whether there is any factor that may affect the tradeoff between offloading efficiency and delay performance. We analyze delayed data offloading using an M/MMSP/1 queueing model with three service states. Our analytical results show that, though the offloading efficiency is enhanced at the expense of queueing delay, a higher moving speed of mobile users helps reduce the queueing delay incurred by delayed offloading. This indicates that the deadline of delayed offloading strategies can be optimized according to the mobility of the terminals once the delay requirement is given.

The rest of this paper is organized as follows. In Section II, we describe the delayed offloading strategy in detail and show that the offloading procedure is essentially a three-state M/MMSP/1 queueing system. In Section III, we establish a hybrid embedded Markov chain to derive the mean delay and the offloading efficiency. In Section IV, we show that the moving speed of mobile users can reduce the queueing delay incurred by delayed offloading. Section V concludes this paper.

II Delayed Offloading of Terminals Without Concurrent Transmission

When people move in an urban area, they pass through Wi-Fi coverage and cellular-only coverage alternately. A mobile terminal without concurrent transmission capability therefore perceives the wireless channel as switching between two states over time, as illustrated in Fig. 1: one state in which only the cellular signal is present and another in which the Wi-Fi signal is available.

Assume that the dwell times in the two wireless channel states are exponentially distributed with their respective rate parameters. The wireless channel perceived by the mobile terminal in motion is then a kind of Markov channel [14]. Clearly, given the deployment of Wi-Fi hotspots, the faster the user moves, the larger both channel transition rates are. Thus, we use these transition rates to delineate the mobility of the user.

Fig. 1: Transition of wireless environment state.

II-A Delayed Offloading Procedure

In the face of such a wireless environment, as described by the Markov channel in Fig. 1, we consider the following delayed offloading procedure for current commercial terminals. When the Wi-Fi signal is available, the terminal transmits the traffic through Wi-Fi. Once the Wi-Fi connection is lost, the terminal pauses the transmission to wait for the next Wi-Fi hotspot and, at the same time, randomly selects a deadline. If a Wi-Fi signal becomes available before the deadline expires, the terminal resumes the transmission through Wi-Fi; otherwise, it continues the transmission via the cellular network.

Hence, during the whole delayed offloading procedure, the terminal has three transmission (or service) states: (1) the delayed state (state 0), in which transmission is suspended, (2) the cellular state (state 1), in which transmission proceeds via the cellular network, and (3) the Wi-Fi state (state 2), in which transmission proceeds via Wi-Fi. The transitions among these service states are plotted in Fig. 2.
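To make the procedure concrete, the following is a minimal event-driven sketch of the service-state machine described above. It assumes exponentially distributed dwell and deadline times; all names (wifi_gain_rate, wifi_loss_rate, deadline_rate) and values are illustrative choices of ours, not the paper's notation.

```python
import random

# A minimal event-driven sketch of the delayed offloading procedure described
# above. The rate names (wifi_gain_rate, wifi_loss_rate, deadline_rate) are our
# own illustrative choices, not the paper's notation.

DELAYED, CELLULAR, WIFI = 0, 1, 2

def simulate_service_states(wifi_gain_rate, wifi_loss_rate, deadline_rate,
                            horizon, seed=0):
    """Simulate the terminal's service state on [0, horizon] and return the
    fraction of time spent in each of the three states."""
    rng = random.Random(seed)
    t, state = 0.0, WIFI                  # start under Wi-Fi coverage
    time_in_state = [0.0, 0.0, 0.0]
    while t < horizon:
        if state == WIFI:
            # Transmit via Wi-Fi until the Wi-Fi signal is lost.
            dwell, nxt = rng.expovariate(wifi_loss_rate), DELAYED
        elif state == DELAYED:
            # Race between the next Wi-Fi hotspot and the deadline timer.
            to_wifi = rng.expovariate(wifi_gain_rate)
            to_deadline = rng.expovariate(deadline_rate)
            dwell, nxt = ((to_wifi, WIFI) if to_wifi < to_deadline
                          else (to_deadline, CELLULAR))
        else:
            # Transmit via cellular until the Wi-Fi signal returns.
            dwell, nxt = rng.expovariate(wifi_gain_rate), WIFI
        dwell = min(dwell, horizon - t)
        time_in_state[state] += dwell
        t += dwell
        state = nxt
    return [x / horizon for x in time_in_state]

if __name__ == "__main__":
    # Purely illustrative rates (per second).
    print(simulate_service_states(0.01, 0.02, 0.005, horizon=1e6))
```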

II-B Three-State Markov Modulated Service Process

Suppose that the deadline set for the delayed state is an exponentially distributed random variable. The data transmission process of the mobile terminal can then be considered a three-state Markov modulated service process (MMSP) [15]. The transition rates among the service states in Fig. 2 follow from the channel transition rates in Fig. 1 and the deadline rate, and are given by

(1a)
(1b)
(1c)

Fig. 2: State transition of the data transmission.

It follows that the steady-state probabilities of each service state in Fig. 2 are given by

(2a)
(2b)
(2c)

where one quantity is the expectation of the deadline and the other is referred to as the Wi-Fi available ratio in this paper, since it indicates the fraction of time during which the terminal can perceive Wi-Fi signals.

Let each service state in Fig. 2 have its own transmission rate; clearly, the transmission rate of the delayed state is zero. It follows that the average transmission rate that a mobile terminal with the delayed offloading strategy can provide is given by:

(3)
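As a numerical illustration of (2) and (3), the following sketch solves the balance equations of the three-state modulating chain and averages the per-state transmission rates. The generator below follows our reading of Figs. 1 and 2; all names (wifi_gain_rate, wifi_loss_rate, deadline_rate, mu_cell, mu_wifi) and values are illustrative assumptions.

```python
import numpy as np

# A numerical sketch of (2) and (3): solve the balance equations of the
# three-state modulating chain in Fig. 2 and average the per-state transmission
# rates. The generator follows our reading of Figs. 1 and 2; all names are
# illustrative assumptions.

def steady_state_and_avg_rate(wifi_gain_rate, wifi_loss_rate, deadline_rate,
                              mu_cell, mu_wifi):
    # States: 0 = delayed, 1 = cellular, 2 = Wi-Fi.
    Q = np.array([
        [-(wifi_gain_rate + deadline_rate), deadline_rate, wifi_gain_rate],
        [0.0, -wifi_gain_rate, wifi_gain_rate],
        [wifi_loss_rate, 0.0, -wifi_loss_rate],
    ])
    # Solve pi Q = 0 with sum(pi) = 1 (replace one balance equation).
    A = np.vstack([Q.T[:-1], np.ones(3)])
    b = np.array([0.0, 0.0, 1.0])
    pi = np.linalg.solve(A, b)
    rates = np.array([0.0, mu_cell, mu_wifi])   # the delayed state transmits nothing
    return pi, float(pi @ rates)

pi, avg_rate = steady_state_and_avg_rate(0.01, 0.02, 0.005, 1.0, 1.0)
print(pi, avg_rate)
```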

III Analysis of Mean Delay and Offloading Efficiency

In this section, we analyze the performance of the data offloading of terminals without concurrent transmission capability. Suppose the input traffic is a Poisson process with a given arrival rate. The data transmission of the offloading procedure can then be modeled as an M/MMSP/1 queueing system with three service states. The difficulty of analyzing the M/MMSP/1 queue lies in the fact that the service time of a file depends on the service state when its service starts. To cope with this problem, we use the hybrid embedded Markov chain developed in [16].
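Before the analysis, a simple discrete-event simulation of the M/MMSP/1 queue just described can serve as a numerical cross-check of the mean delay and the offloading efficiency derived below. It exploits the memorylessness of all the exponential clocks involved; parameter names and values are our own illustrative assumptions.

```python
import random

# A discrete-event sketch of the three-state M/MMSP/1 queue described above,
# usable as a numerical cross-check of the analysis. All clocks are exponential
# and hence memoryless, so they can be redrawn after every event. Parameter
# names and values are illustrative assumptions.

DELAYED, CELLULAR, WIFI = 0, 1, 2

def simulate_mmsp1(lam, mu_cell, mu_wifi, wifi_gain_rate, wifi_loss_rate,
                   deadline_rate, horizon=5e5, seed=1):
    rng = random.Random(seed)
    mu = {DELAYED: 0.0, CELLULAR: mu_cell, WIFI: mu_wifi}
    trans = {DELAYED: {WIFI: wifi_gain_rate, CELLULAR: deadline_rate},
             CELLULAR: {WIFI: wifi_gain_rate},
             WIFI: {DELAYED: wifi_loss_rate}}
    t, state = 0.0, WIFI
    queue = []                                # arrival times, FIFO order
    delays, served_by = [], {CELLULAR: 0, WIFI: 0}
    while t < horizon:
        # Competing exponential clocks that are active right now.
        clocks = [("arrival", lam)]
        clocks += [(("move", s), r) for s, r in trans[state].items()]
        if queue and mu[state] > 0.0:
            clocks.append(("departure", mu[state]))
        total = sum(r for _, r in clocks)
        t += rng.expovariate(total)
        # Pick the clock that fired, proportionally to its rate.
        x, event = rng.random() * total, clocks[-1][0]
        for ev, r in clocks:
            x -= r
            if x <= 0.0:
                event = ev
                break
        if event == "arrival":
            queue.append(t)
        elif event == "departure":
            delays.append(t - queue.pop(0))   # delay of the finished HOL file
            served_by[state] += 1
        else:                                 # service state transition
            state = event[1]
    mean_delay = sum(delays) / len(delays)
    efficiency = served_by[WIFI] / (served_by[WIFI] + served_by[CELLULAR])
    return mean_delay, efficiency

print(simulate_mmsp1(lam=0.3, mu_cell=1.0, mu_wifi=1.0, wifi_gain_rate=0.01,
                     wifi_loss_rate=0.02, deadline_rate=0.005))
```

Here a completed file is attributed to the service state in which its service finishes, which is consistent with the time-average argument used for the offloading efficiency in Section III-E.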

III-A Embedded Points

Two types of time points are embedded into the data offloading process. We consider the epoch when a file starts its service, since the service time of a file depends on the service state at this epoch. We also consider the epochs at which the service state changes, since the dependency of the service time is essentially caused by service state transitions during the service of a file. We thus define the two types of embedded points as follows:

  1. State-transition point: the epoch when the service state transits to a new service state;

  2. Start-service point: the epoch when a file starts its service in a given service state.

Clearly, the time interval between two adjacent embedded points is exponentially distributed.

Suppose the current epoch is an embedded point at which the service state is the delayed state 0, as Fig. 3(a) shows. Since the service is suspended at the current epoch, the next event must be a state transition from service state 0 to one of the other service states, and the time until each candidate transition is an exponential random variable with the corresponding transition rate. Thus, the type of the next embedded point is determined by which service state transition happens first. It follows that the interval from the current point to the next point is exponentially distributed with rate equal to the sum of the outgoing transition rates, and the next embedded point is the corresponding state-transition point with probability equal to that transition rate divided by the sum.

Similarly, when the current epoch is an embedded point at which the transmission state is state 1 or state 2, the next embedded point is either a start-service point, with probability equal to the service rate over the total rate, or a state-transition point, with probability equal to the corresponding transition rate over the total rate, as shown in Fig. 3(b) and (c). Again, the interval from the current point to the next point is exponentially distributed, with rate equal to the sum of the service rate and the outgoing transition rates.
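The following small helper illustrates this competing-clock argument: from an embedded point in a given service state, the next embedded point is decided by a race of independent exponential clocks, so each candidate occurs with probability equal to its rate divided by the total rate. Names and values are illustrative assumptions.

```python
# A small helper illustrating the competing-clock argument above: each candidate
# event occurs first with probability rate / (sum of rates).

def next_point_distribution(service_rate, outgoing_rates):
    """service_rate: completion rate of the file in service (0 in the delayed
    state, so no start-service point can be the next embedded point there).
    outgoing_rates: dict mapping target service state -> transition rate."""
    total = service_rate + sum(outgoing_rates.values())
    probs = {("state-transition", s): r / total
             for s, r in outgoing_rates.items()}
    if service_rate > 0.0:
        probs["start-service"] = service_rate / total
    return total, probs

# Example: Wi-Fi state with an assumed service rate of 1.0 and Wi-Fi loss rate 0.02.
print(next_point_distribution(1.0, {0: 0.02}))
```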

Fig. 3: Relationship between two kinds of embedded points.

III-B Start Service Probability

The start service probability is defined as the probability that a data file starts its service in a given state. Consider a newly arrived file that sees a certain number of files in the buffer. These files are labeled according to their sequence in the queue: the head-of-line (HOL) file is labeled 0 and the newly arrived file carries the largest label. We define two types of conditional probabilities corresponding to the embedded points:

  1. the conditional probability that a given file in the queue starts its service in a given service state, given that the newly arrived file sees a certain number of files in the buffer;

  2. the conditional probability that the service state transits to a given state while that file is in service, given that the newly arrived file sees a certain number of files in the buffer.

The first type of conditional probability is defined on the embedded point at which the corresponding file finishes its service in a given service state. Thus, the last event before this point may be either that the file started its service in that service state or that the service state transited to that state while the file was in service. Therefore, the equations of the first type of conditional probability in each state are obtained:

(4a)
(4b)
(4c)

Similarly, the equations of the second type of conditional probability are given by:

(5a)
(5b)
(5c)

Combining (4) and (5), we obtain the relations between the two types of conditional probabilities:

(6)

where the coefficient matrix is

(7)

and

We solve (6) and obtain

(8a)
(8b)
(8c)

where the two constants arise from solving (6). For the HOL file, this quantity is the probability that the HOL file starts its service in a given service state, given that the newly arrived file sees a certain number of files in the buffer. It can therefore be expressed as the ratio of the stationary probability that the buffer holds that number of files while the service is in the given state to the stationary probability that the buffer holds that number of files.

By definition, the probability that a newly arrived file starts its service in a given state is obtained by averaging over the number of files it sees in the buffer upon its arrival. Thus, the start service probability is

(9)

Combining equations (8) and (9), we have:

(10a)
(10b)
(10c)

The numerical solutions of these stationary probabilities can be derived by establishing a two-dimensional continuous-time Markov chain, which is not given in detail due to the space limitation.
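For completeness, the following is one possible numerical realization of such a two-dimensional chain: the state is the pair (queue length, service state), the buffer is truncated at a finite level for computation, and the modulating chain is assumed to evolve independently of the queue. The truncation level and all rate names and values are our own assumptions.

```python
import numpy as np

# One possible numerical realization of the two-dimensional CTMC mentioned
# above: state = (queue length, service state), buffer truncated at N, and the
# modulating chain assumed to evolve independently of the queue. Truncation
# level and rates are illustrative assumptions.

def stationary_2d_ctmc(lam, mu, Q_mod, N=200):
    """lam: arrival rate; mu[s]: service rate in state s (0 for delayed);
    Q_mod: generator of the modulating chain; N: buffer truncation level."""
    S = Q_mod.shape[0]
    dim = (N + 1) * S
    idx = lambda n, s: n * S + s
    Q = np.zeros((dim, dim))
    for n in range(N + 1):
        for s in range(S):
            i = idx(n, s)
            if n < N:                          # arrival (blocked at truncation level)
                Q[i, idx(n + 1, s)] += lam
            if n > 0 and mu[s] > 0.0:          # service completion
                Q[i, idx(n - 1, s)] += mu[s]
            for s2 in range(S):                # modulating transition
                if s2 != s:
                    Q[i, idx(n, s2)] += Q_mod[s, s2]
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))
    A = np.vstack([Q.T[:-1], np.ones(dim)])    # pi Q = 0 plus normalization
    b = np.zeros(dim); b[-1] = 1.0
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    return pi.reshape(N + 1, S)                # pi[n, s] = P{n files, state s}

# States ordered (delayed, cellular, Wi-Fi); rates are illustrative.
Q_mod = np.array([[-0.015, 0.005, 0.01],
                  [0.0, -0.01, 0.01],
                  [0.02, 0.0, -0.02]])
pi = stationary_2d_ctmc(lam=0.3, mu=[0.0, 1.0, 1.0], Q_mod=Q_mod, N=200)
print(pi.sum(axis=1)[:5])                      # marginal queue-length probabilities
```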

III-C Mean Service Time

Let the conditional service time be the time needed to serve a file that starts its service in a given state. Consider a file that arrives when the system is empty and in the delayed state; we say this file starts its service in the delayed state 0. At the next embedded point, the service state changes to one of the other states with the corresponding transition probability, after which the remaining time needed to finish the service is the conditional service time of the new state. Taking into account the expected time from the current point to the next embedded point, the expectation of the conditional service time in the delayed state is given by:

(11a)
Similarly, we obtain the expectations for the cellular and Wi-Fi states in (11b) and (11c).
(11b)
(11c)

Solving (11), we can derive the conditional mean service times in (12a)-(12c):

(12a)
(12b)
(12c)

The mean service time is thus

(13)
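The first-step argument above leads to a small linear system for the conditional mean service times: starting in a given state, the next event occurs after an exponential time whose rate is the sum of the service rate and the outgoing transition rates, and conditioning on which event occurs yields one equation per state. The following sketch solves this system under our own illustrative rate names, with a zero service rate in the delayed state.

```python
import numpy as np

# A sketch of the first-step equations behind (11): starting service in state i,
# the next event occurs after an Exp(mu_i + q_i) time, where q_i is the total
# outgoing modulating rate; conditioning on whether that event is a completion
# or a transition gives (mu_i + q_i) E[T_i] = 1 + sum_j q_ij E[T_j]. Rates are
# illustrative assumptions, with mu = 0 in the delayed state.

def mean_conditional_service_times(mu, Q_mod):
    S = len(mu)
    A = np.zeros((S, S))
    b = np.ones(S)
    for i in range(S):
        q_i = sum(Q_mod[i, j] for j in range(S) if j != i)
        A[i, i] = mu[i] + q_i
        for j in range(S):
            if j != i:
                A[i, j] = -Q_mod[i, j]
    return np.linalg.solve(A, b)               # [E[T_0], E[T_1], E[T_2]]

Q_mod = np.array([[-0.015, 0.005, 0.01],
                  [0.0, -0.01, 0.01],
                  [0.02, 0.0, -0.02]])
print(mean_conditional_service_times([0.0, 1.0, 1.0], Q_mod))
```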

III-D Mean Waiting Time and Mean Delay

The waiting time of a file is the duration from the time it arrives at the system to the time it becomes the HOL file. It equals the sum of the residual service time of the HOL file and the service times of all the data files queued before this file. For a file in the queue, we define two types of conditional elapsed time:

  1. the expected time from the epoch when the newly arrived file reaches a given position in the queue while the service state is a given state, to the epoch when it becomes the HOL file, given the number of files it sees in the buffer upon arrival;

  2. the expected time from the epoch when the service state transits to a given state while the newly arrived file occupies a given position in the queue, to the epoch when it becomes the HOL file, given the number of files it sees in the buffer upon arrival.

Following arguments similar to those used above, we have

(14)

Solving (14) and using the relation between the mean waiting time and these conditional elapsed times, we obtain the mean waiting time:

(15)

Also, we have the mean delay as follows:

(16)

III-E Offloading Efficiency

Recall that the offloading efficiency is defined as the ratio of the traffic transmitted via Wi-Fi to the total traffic. Consider a very long time period. In this period, the total input traffic equals the arrival rate times the length of the period. On the other hand, the part of the traffic served by Wi-Fi equals the probability that the server is transmitting while the service state is the Wi-Fi state, multiplied by the Wi-Fi transmission rate and the length of the period. It follows that

(17)
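Written out with our own symbol choices (which may differ from the paper's), the argument above takes the following form, where T is the long observation period, \lambda the arrival rate, \mu_W the Wi-Fi transmission rate, and P_W the probability that the server is busy in the Wi-Fi state:

```latex
% A worked form of the argument above, with our own symbol names (they may
% differ from the paper's): T is the long observation period, \lambda the
% arrival rate, \mu_{W} the Wi-Fi transmission rate, and P_{W} the probability
% that the server is busy while the service state is the Wi-Fi state.
\[
  \eta \;=\; \frac{\text{traffic served via Wi-Fi}}{\text{total input traffic}}
       \;=\; \frac{P_{W}\,\mu_{W}\,T}{\lambda\,T}
       \;=\; \frac{P_{W}\,\mu_{W}}{\lambda}.
\]
```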

IV Performance Evaluations

In reality, the deployment of Wi-Fi hotspots in a city is fixed. The factors that may affect the performance of the offloading procedure are therefore the deadline set for the delayed service state and the moving speed of the users. Based on the analytical results in Section III, we study how the deadline and the moving speed of users affect performance metrics such as the mean delay and the offloading efficiency.

We consider two application scenarios in this section. In the first, the terminals are carried by pedestrians, which corresponds to relatively small channel transition rates and thus low mobility. In the second, the terminals are carried by vehicles, which corresponds to larger channel transition rates and thus high mobility [8].

IV-A Equal Wi-Fi and Cellular Transmission Rates

To facilitate the discussion, we first consider the case where the transmission rates of the Wi-Fi connection and the cellular network are the same.

We plot the offloading efficiency and the mean delay versus the expectation of the deadline in Fig. 4. It can be seen from Fig. 4 that both the offloading efficiency and the mean delay monotonically increase with the expected deadline regardless of the mobility, which means the offloading efficiency is improved at the expense of the mean delay. However, it is very interesting to see that, over the whole range of the expected deadline, the increments of the offloading efficiency are the same while the increments of the mean delay differ between the two application scenarios. For example, for the same increment of 0.7 in the offloading efficiency, the increment of the mean delay in the pedestrian scenario is larger than that in the vehicular scenario. This implies that a terminal with higher mobility tends to pay a smaller cost in mean delay to obtain the same increment of offloading efficiency, which is visualized in Fig. 5, where the mean delay is plotted as a function of the offloading efficiency.

(a) efficiency vs. deadline
(b) delay vs. deadline
Fig. 4: Performance of the delayed offloading when the Wi-Fi and cellular transmission rates are equal.
Fig. 5: Delay vs. efficiency when the Wi-Fi and cellular transmission rates are equal.

Based on the analytical results in Section III, we explain this point by considering two extreme cases of the expected deadline as follows. When the expected deadline is very small, the delayed service state in Fig. 2 disappears and the terminal transmits the file immediately after it loses the Wi-Fi signal. In this case, though the Wi-Fi state, the cellular state, and the transitions between them remain, the transmission rate stays unchanged over time (since the two rates are equal here), which implies that the service process, and thus the queue length, is independent of the service state transitions. In other words, the M/MMSP/1 queue reduces to an M/M/1 queue. It follows that the offloading efficiency in (17) now equals the Wi-Fi available ratio, and the mean delay in (16) reduces to that of the M/M/1 queue, no matter what the mobility is. On the other hand, when the expected deadline is extremely large, the cellular state in Fig. 2 disappears. In this case, the offloading process is actually an M/MMSP/1 queue with two service states, the Wi-Fi state and the delayed state, and the terminal transmits traffic only via Wi-Fi. Thus, the offloading efficiency in (17) increases to 1, regardless of the mobility. Moreover, it is easy to show that the mean delay in (16) goes to

(18)

as the expected deadline approaches infinity. It is obvious that this limiting delay decreases with the mobility.
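For reference, the small-deadline limit discussed above corresponds to the standard M/M/1 mean sojourn time, written here with our own symbols (\lambda for the arrival rate, \mu for the common transmission rate, \bar{\tau} for the expected deadline); it is independent of the channel transition rates, consistent with mobility having no effect in this limit:

```latex
% The small-deadline limit described above, written with our own symbols
% (\lambda: arrival rate, \mu: common transmission rate, \bar{\tau}: expected
% deadline). It is the standard M/M/1 mean sojourn time and does not depend on
% the channel transition rates.
\[
  \lim_{\bar{\tau}\to 0} E[D] \;=\; \frac{1}{\mu-\lambda}.
\]
```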

Fig. 6: Delay vs. efficiency for two settings in which the Wi-Fi and cellular transmission rates differ: (a) first setting; (b) second setting.

IV-B Unequal Wi-Fi and Cellular Transmission Rates

We also study the relationship between the mean delay and the offloading efficiency when the Wi-Fi and cellular transmission rates differ, as shown in Fig. 6, where all the parameters except the two transmission rates are the same as those in Fig. 4; Fig. 6(a) and Fig. 6(b) use two different pairs of transmission rates. The results in Fig. 6 show that the mobility of the terminal helps reduce the mean-delay cost incurred while increasing the offloading efficiency.

We thus conclude from the above discussion that the mobility of mobile users can reduce the delay incurred by delayed offloading. Based on this conclusion, mobile users can optimize the system performance according to their mobility. For example, suppose a user has a delay requirement based on its delay tolerance. Our analysis shows that the delay cost of enhancing the offloading efficiency is small when the mobility of the user is high. In this case, the user can substantially increase the deadline to improve the offloading efficiency while still meeting the delay requirement.

V Conclusion

In this paper, we analyze the delayed offloading problem of current commercial mobile terminals. We develop a three-state M/MMSP/1 queueing model to derive the offloading efficiency and the mean delay. Through the analysis, we find that the mobility of the users plays an important role in the tradeoff between the offloading efficiency and the mean delay. In particular, the mobility of mobile users can reduce the delay incurred by delayed offloading.

References

  • [1] (2019, February) Cisco visual networking index: Global mobile data traffic forecast update, 2017-2022. [Online]. Available: https://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white-paper-c11-738429.pdf
  • [2] O. B. Yetim and M. Martonosi, “Dynamic adaptive techniques for learning application delay tolerance for mobile data offloading,” in Proc. IEEE INFOCOM, 2015, pp. 1885–1893.
  • [3] (2019, February) The 5g paradox: The need for more offloading options in the next-generation wireless era. [Online]. Available: https://wia.org/wp-content/uploads/WIA_Offload-web.pdf
  • [4] (2010) White paper - mobile data offloading through wi-fi. [Online]. Available: https://www.sourcesecurity.com/docs/moredocs/proximmicrosite/Mobile-Data-Offloading-Through-WiFi-V1.2.pdf
  • [5] M. Afanasyev, T. Chen, G. M. Voelker, and A. C. Snoeren, “Analysis of a mixed-use urban wifi network: when metropolitan becomes neapolitan,” in Proceedings of the 8th ACM SIGCOMM conference on Internet measurement, 2008, pp. 85–98.
  • [6] N. Cheng, N. Lu, N. Zhang, X. S. Shen, and J. W. Mark, “Opportunistic wifi offloading in vehicular environment: A queueing analysis,” in 2014 IEEE Global Communications Conference, 2014, pp. 211–216.
  • [7] K. Lee, J. Lee, Y. Yi, I. Rhee, and S. Chong, “Mobile data offloading: How much can wifi deliver?” IEEE/ACM Transactions on Networking (ToN), vol. 21, no. 2, pp. 536–550, 2013.
  • [8] F. Mehmeti and T. Spyropoulos, “Performance modeling, analysis, and optimization of delayed mobile data offloading for mobile users,” IEEE/ACM Transactions on Networking (TON), vol. 25, no. 1, pp. 550–564, 2017.
  • [9] A. Ajith and T. Venkatesh, “Qoe enhanced mobile data offloading with balking,” IEEE Communications Letters, vol. 21, no. 5, pp. 1143–1146, 2017.
  • [10] N. Wang and J. Wu, “Opportunistic wifi offloading in a vehicular environment: Waiting or downloading now?” in Proc. IEEE INFOCOM, 2016, pp. 1–9.
  • [11] S. Wiethölter, M. Emmelmann, R. Andersson, and A. Wolisz, “Performance evaluation of selection schemes for offloading traffic to ieee 802.11 hotspots,” in Proc. IEEE ICC, 2012, pp. 5423–5428.
  • [12] C. Hua, H. Yu, R. Zheng, J. Li, and R. Ni, “Online packet dispatching for delay optimal concurrent transmissions in heterogeneous multi-rat networks,” IEEE Transactions on Wireless Communications, vol. 15, no. 7, pp. 5076–5086, 2016.
  • [13] C. Zhang, B. Gu, Z. Liu, K. Yamori, and Y. Tanaka, “Cost- and energy-aware multi-flow mobile data offloading using markov decision process,” IEICE Transactions on Communications, vol. advpub, 2017.
  • [14] L. Huang and T. T. Lee, “Generalized pollaczek-khinchin formula for markov channels,” IEEE Transactions on Communications, vol. 61, no. 8, pp. 3530–3540, 2013.
  • [15] ——, “Queueing behavior of hybrid arq wireless system with finite buffer capacity,” in 2012 21st Annual Wireless and Optical Communications Conference (WOCC), 2012, pp. 32–36.
  • [16] J. Zhang, Z. Zhou, T. T. Lee, and T. Ye, “Delay analysis of three-state markov channels,” in 12th Int. Conf. Queueing Theory Network Applications, vol. 61, August 2017, pp. 101–117.