Latency-Energy Tradeoff based on Channel Scheduling and Repetitions in NB-IoT Systems

07/15/2018 · by Amin Azari et al. · KTH Royal Institute of Technology and Aalborg University

Narrowband IoT (NB-IoT) is the latest IoT connectivity solution presented by the 3GPP. NB-IoT introduces coverage classes and achieves a significant link budget improvement by allowing repeated transmissions by nodes that experience high path loss. However, those repetitions necessarily increase the energy consumption and the latency in the whole NB-IoT system. The extent to which the whole system is affected depends on the scheduling of the uplink and downlink channels. We address this question, not treated previously, by developing a tractable model of NB-IoT access protocol operation, comprising message exchanges in random-access, control, and data channels, both in the uplink and downlink. The model is then used to analyze the impact of channel scheduling as well as the interaction of coexisting coverage classes, through the derivation of the expected latency and battery lifetime for each coverage class. These results are subsequently employed in the investigation of the latency-energy tradeoff in NB-IoT channel scheduling as well as in determining the optimized operation points. Simulation results show the validity of the analysis and confirm that channel scheduling has a significant impact on the latency and lifetime performance of NB-IoT devices.


I Introduction

The Internet of Things (IoT) is behind two of the three major drivers of next-generation wireless networks: massive IoT connectivity, mission-critical IoT connectivity, and enhanced mobile broadband (eMBB) [1]. Due to the fundamental differences in characteristics and service requirements between IoT and legacy traffic in cellular networks, reflected in the massive number of connected devices, short packet sizes, and long required battery lifetimes, revolutionary connectivity solutions have been proposed and implemented by industry [2, 3]. The most prominent examples of such solutions are SigFox, introduced in 2009, and LoRa, introduced in 2015, both operating in the unlicensed band, i.e., 868 MHz in Europe [3, 4]. On the other hand, the accommodation of IoT traffic over cellular networks has been investigated by the 3GPP, which proposed evolutionary solutions like LTE Cat-1 and LTE Cat-M [5, 6]. Recently, these efforts have also been complemented by the introduction of revolutionary solutions like NB-IoT [7].

Fig. 1: NB-IoT features frequency-division duplex for uplink and downlink [8]. Downlink/uplink NP channels and signals are time multiplexed, as depicted in the figure.

NB-IoT represents a big step towards the realization of massive IoT connectivity over cellular networks [9]. Communication in NB-IoT systems takes place in a narrow bandwidth of 180 kHz, resulting in more than 20 dB link budget improvement over legacy LTE. This enables smart devices deployed in remote locations, e.g., basements, to communicate with the base station (BS). As the legacy signaling and communication protocols were designed for large bandwidths, NB-IoT introduces a solution with five new narrowband physical (NP) channels [10, 8], see Fig. 1: random access channel (NPRACH), uplink shared channel (NPUSCH), downlink shared channel (NPDSCH), downlink control channel (NPDCCH), and broadcast channel (NPBCH). NB-IoT also introduces four new physical signals: the demodulation reference signal (DMRS), sent with user data on NPUSCH, the narrowband reference signal (NRS), the narrowband primary synchronization signal (NPSS), and the narrowband secondary synchronization signal (NSSS). Prior works on NB-IoT investigated preamble design for access reservation of devices over NPRACH [11, 12], uplink resource allocation to the connected devices [13], coverage and capacity analysis of NB-IoT systems in rural areas [14], coverage of NB-IoT with consideration of external interference due to deployment in the guard band [15], and the impact of channel coherence time on the coverage of NB-IoT systems [16]. Further, in [17], the energy consumption of IoT devices in data transmission over NB-IoT systems in normal, robust, and extreme coverage scenarios has been investigated. The results obtained in [17] illustrate that NB-IoT significantly reduces the energy consumption with respect to legacy LTE, owing to the existence of a deep sleep mode for devices that are registered to the BS.

In this paper, we address an important and so far untreated problem: when and how many resources to allocate to NPRACH, NPUSCH, NPDCCH, and NPDSCH in coexistence scenarios, where the BS serves NB-IoT devices with random activations that belong to different coverage classes. The solution to this problem has a significant impact on service execution and device performance, as the resource allocation to different channels faces inherent tradeoffs. The essence of the tradeoff can be explained as follows. If random access opportunities (NPRACH) occur frequently, fewer uplink radio resources remain for the uplink data channel (NPUSCH), which increases the latency of data transmissions. On the other hand, if NPRACH is scheduled infrequently, latency and energy consumption in access reservation increase due to the extended idle-listening time and increased collision probability. Further, as device scheduling for uplink/downlink channels is performed over NPDCCH, infrequent scheduling of this channel may lead to wasted uplink resources in NPUSCH and increased latency in data transmissions. Conversely, if NPDCCH occurs too frequently, the latency and energy consumption of transmissions over NPUSCH will increase. Another important aspect studied in the paper is the impact of the signal repetitions used by devices located far away from the BS on the battery lifetime and latency performance of the other devices in the system.

The remainder of the paper is structured as follows. In the next section, we outline the motivation for the development of an NB-IoT-specific analysis of channel scheduling and the reasons why the existing LTE models cannot be used, and then we list the contributions of the paper. Section III is devoted to the system model. Section IV presents the analysis. The investigation of the operational tradeoffs and the performance evaluation are presented in Section V. Concluding remarks are given in Section VI.

II Motivation and Contributions

The literature on latency and energy analysis and optimization for LTE networks is mature [18, 19]. Furthermore, latency and energy tradeoffs in IoT scheduling over LTE networks were investigated in [20, 21, 22]. However, although NB-IoT access networking is heavily inspired by LTE, there are several crucial differences that prevent the use of the LTE models: (i) in NB-IoT, all communications happen in a single LTE resource block, and hence control, broadcast, random access, and data channels are multiplexed on the same radio resource; (ii) a set of coverage classes has been defined, which enables devices experiencing extreme pathloss values to become connected by leveraging repetitions of transmitted signals; and (iii) the control plane has been adapted to IoT characteristics, enabling devices to stay disconnected for several hours while remaining registered to the BS, which is not possible in LTE. Further, the introduction of coverage classes also brings novel concerns related to coexistence scenarios, where devices from different coverage classes are served within a cell and thus mutually impact their communication with the BS. For example, one may consider a scenario in which the uplink is mainly occupied by the random access and data transmissions of devices with poor coverage, which require high numbers of repetitions. In such cases, the random access and data channels of the other classes cannot be scheduled frequently, which degrades their latency and energy performance. In order to properly address the distinguishing features of NB-IoT, in this paper we extend the latency/energy models in [23, 24, 20, 17], incorporate the NB-IoT channel multiplexing, and consider the coexistence of devices from a diverse set of coverage classes in the same cell.

Specifically, the main contributions of this work are:

  • Derivation of a tractable analytical model of the channel scheduling problem in NB-IoT systems that considers message exchanges on both downlink and uplink channels, from synchronization to service completion.

  • Derivation of closed-form analytical expressions for service latency and energy consumption, as well as of an expected battery lifetime model for devices connected to the network.

  • Investigation of a latency-energy tradeoff in channel scheduling for NB-IoT systems.

  • Investigation of the interaction among the coverage classes coexisting in the system: the performance loss in one coverage class due to an increase in the number of connected devices from another coverage class.

III System Model

III-A NB-IoT Access Networking

Assume an NB-IoT cell with a base station (BS) located in its center and N devices uniformly distributed in it. In general, there are C coverage classes defined in an NB-IoT cell, where the BS assigns a device to a class based on the estimated path loss between them and informs the device of its assignment. Class j, j ∈ {1, …, C}, is characterized by the number of replicas c_j that must be transmitted per original data/control packet. For example, based on the specifications in [8], each device belonging to class j shall repeat the preamble transmitted over NPRACH c_j times. Further, denote by f_j the fraction of devices belonging to class j, by N_r the number of communication sessions that a typical IoT device performs daily, and by p_u the probability that a device requests uplink service. The arrival rates of uplink and downlink service requests to the system are, respectively:

λ_u = p_u N N_r / T,   λ_d = (1 − p_u) N N_r / T,   (1)

where T denotes the length of a day expressed in the chosen time unit.
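As a quick illustration of (1), assuming rates per second (i.e., T = 86400 s), the following Python sketch computes the aggregate uplink and downlink request arrival rates; all variable names are ours, chosen for readability.

```python
# Sketch of eq. (1): aggregate uplink/downlink service-request arrival rates.
# Assumes rates per second (T = 86400 s); all names are illustrative.

SECONDS_PER_DAY = 24 * 3600

def arrival_rates(num_devices: int, sessions_per_day: float, p_uplink: float):
    """Return (lambda_u, lambda_d) in requests per second."""
    total_per_second = num_devices * sessions_per_day / SECONDS_PER_DAY
    return p_uplink * total_per_second, (1.0 - p_uplink) * total_per_second

lam_u, lam_d = arrival_rates(num_devices=10_000, sessions_per_day=12, p_uplink=0.5)
print(f"uplink: {lam_u:.3f} req/s, downlink: {lam_d:.3f} req/s")
```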
Fig. 2: Communication exchanges and power consumption in NB-IoT access networking. Note: reference signals, including NRS, NPSS, NSSS, and the master information block (MIB), are broadcast regularly; here we show only a single realization.

Initially, when an NB-IoT device requires an uplink/downlink service, it first listens for the cell information, i.e., NPSS and NSSS, through which it synchronizes with the BS. Then, the device performs access reservation by sending a random access (RA) request to the BS over NPRACH. The BS responds to a successfully received RA request by sending the random access response (RAR) message over NPDCCH, indicating the resources reserved for serving the device. Finally, the device sends/receives data to/from the BS over the NPUSCH/NPDSCH channels, which, depending on the application, may be followed by an acknowledgment (ACK) [8]. In contrast to LTE, a device that is connected to the BS can go to a deep sleep state [7, Section 7.3], which does not exist in LTE and from which the device can reconnect just by transmitting an RA request accompanied by a random number [7, Fig. 7.3.4.5-1]. This new functionality aims to address the inefficient handling of IoT communications by LTE [22, 17], as it saves significant energy: IoT devices do not need to repeat all steps of the connection establishment procedure. Fig. 2 represents the access protocol exchanges for NB-IoT, as described in [7, Section 7.3]. (For the sake of completeness, we also mention another novel reconnection scheme designed for NB-IoT, in which a device can request to resume its previous connection after receiving the RAR [10, Section III]; to this end, it responds to the RAR message with its previous connection ID as well as the cause for resuming the connection.)
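To make the message flow above concrete, the following Python sketch lists the exchanges of an uplink service as (channel, direction, message) steps and sums a nominal duration per step; the durations are placeholders we chose for illustration, not values from the specification.

```python
# Minimal sketch of the NB-IoT uplink service message flow described above.
# Durations are illustrative placeholders (ours), not 3GPP timing values.

from typing import List, Tuple

UPLINK_SERVICE_FLOW: List[Tuple[str, str, str, float]] = [
    ("NPSS/NSSS", "DL", "synchronization",           0.30),
    ("NPRACH",    "UL", "random access preamble",    0.05),
    ("NPDCCH",    "DL", "random access response",    0.04),
    ("NPUSCH",    "UL", "uplink data",               0.20),
    ("NPDSCH",    "DL", "acknowledgment (optional)", 0.02),
]

def total_latency(flow=UPLINK_SERVICE_FLOW) -> float:
    """Sum the nominal per-step durations of the service flow (seconds)."""
    return sum(step[-1] for step in flow)

for channel, direction, message, dur in UPLINK_SERVICE_FLOW:
    print(f"{direction:>2} {channel:<10} {message:<25} {dur*1e3:5.0f} ms")
print(f"total ~ {total_latency()*1e3:.0f} ms")
```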

III-B Problem Formulation

Based on the model presented in Fig. 2, the expected latencies of uplink and downlink communication for a device of class j are, respectively:

D_j^u = D_sy + D_rr,j + D_tx,j,   D_j^d = D_sy + D_rr,j + D_rx,j,   (2)

where D_sy, D_rr,j, D_tx,j, and D_rx,j are the expected time spent in synchronization, resource reservation, data transmission in the uplink service, and data reception in the downlink service, respectively. Similarly, the models of the expected energy consumption of an uplink/downlink communication in class j are:

E_j^u = E_sy + E_rr,j + E_tx,j + E_ack,   E_j^d = E_sy + E_rr,j + E_rx,j + E_ack,   (3)

where E_sy, E_rr,j, E_tx,j, E_rx,j, and E_ack are the expected device energy consumption in synchronization, resource reservation, data transmission in the uplink service, data reception in the downlink service, and optional communications such as the acknowledgment, respectively. Since the energy consumption of a typical IoT device involved in a reporting application can be modeled as a semi-regenerative Poisson process with a regeneration point at the end of each reporting period [25], one may define the expected battery lifetime as the ratio between the stored energy and the energy consumption per reporting period. In this case, the expected battery lifetime can be derived as:

L_j ≈ T_r E_0 / E_j^u,   (4)

where E_0 is the energy storage of the device battery and T_r denotes the length of the reporting period. In order to derive closed-form latency and energy consumption expressions, e.g., for the components of (2) and (3), in the sequel we analytically investigate the impact of channel scheduling, the arrival traffic, and the coexisting coverage classes on the performance indicators of interest.
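The additive structure of (2)-(4) can be prototyped directly; the sketch below assumes exactly that structure and uses our own field names and placeholder values (the closed-form component values are derived in Section IV).

```python
# Sketch of the additive latency/energy structure in (2)-(4).
# Component values and the battery capacity are placeholders.

from dataclasses import dataclass

@dataclass
class ServiceComponents:
    sync: float            # synchronization (NPSS/NSSS)
    reservation: float     # resource reservation (NPRACH + RAR)
    data: float            # data transmission (uplink) or reception (downlink)
    optional: float = 0.0  # e.g., acknowledgment (used in the energy model)

def service_latency(c: ServiceComponents) -> float:
    """Expected service latency, cf. eq. (2)."""
    return c.sync + c.reservation + c.data

def service_energy(c: ServiceComponents) -> float:
    """Expected service energy, cf. eq. (3)."""
    return c.sync + c.reservation + c.data + c.optional

def battery_lifetime_days(e0_joules: float, energy_per_report: float,
                          reports_per_day: float) -> float:
    """Stored energy over daily drain, cf. eq. (4) with T_r = 1 day / N_r."""
    return e0_joules / (energy_per_report * reports_per_day)

energy = ServiceComponents(sync=0.03, reservation=0.05, data=0.20, optional=0.01)
print(f"lifetime ~ {battery_lifetime_days(36_000, service_energy(energy), 12):.0f} days")
```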

IV Analysis

As mentioned in Section II, in NB-IoT systems the control, data, random access, and broadcast channels are multiplexed on the same set of radio resources. Thus, their mutual impact in both uplink and downlink directions is significant, which is not the case in legacy LTE due to the wide set of radio resources available there. In the following, we propose a queuing model of NB-IoT access networking which captures these interactions.

IV-A Queuing Model of NB-IoT Access Protocol

Fig. 3 depicts the queuing model of NB-IoT access networking, comprising the operation of the NP random access, control, and data channels. The gray circle represents the uplink server serving two channel queues, NPRACH and NPUSCH, while the yellow circle represents the downlink server serving three channel queues, NPDCCH, NPDSCH, as well as the reference signals, such as NPSS. Let t_j be the average time interval between two consecutive scheduling of the NPRACH of class j and K_j the number of orthogonal random access preambles available in it. The duration of a scheduled NPRACH of class j is c_j τ, where τ is the unit length, equal to the NPRACH length for the coverage class with c_j = 1. The inter-arrival time between two NPRACH periods in NB-IoT can vary from 40 ms to 2.56 s [8]. Further, let δ denote the fraction of time in which reference signals, e.g., NPBCH, NPSS, and NSSS, are scheduled in a downlink radio frame. Five subframes in every two consecutive downlink frames are allocated to reference signals [8], implying δ = 0.25. Finally, a semi-regular scheduling of NPDCCH has been proposed by 3GPP in order to prevent waste of resources in the uplink channel when the BS serves another device with poor coverage in the downlink [26]; we denote by t_d the average time interval between two consecutive NPDCCH instances. In the next section, we derive closed-form expressions for the components of the latency and battery lifetime models given in (2)-(3).
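For reference, the scheduling-related quantities introduced above can be collected into a small configuration object; a minimal sketch follows, in which all default values are illustrative assumptions of ours, except that δ = 5/20 = 0.25 follows directly from the reference-signal allocation stated above.

```python
# Scheduling parameters of the queuing model (symbols follow the text above).
# Default values are illustrative assumptions, except delta = 5/20 = 0.25.

from dataclasses import dataclass

@dataclass
class NbIotScheduling:
    t_j: float = 0.64      # NPRACH period of class j [s]; 40 ms to 2.56 s in NB-IoT
    k_j: int = 48          # orthogonal preambles available per NPRACH of class j
    c_j: int = 1           # number of repetitions for class j
    tau: float = 0.0056    # unit NPRACH length [s] (illustrative)
    t_d: float = 0.10      # NPDCCH scheduling period [s] (illustrative)
    delta: float = 5 / 20  # fraction of DL subframes carrying reference signals

    def nprach_duration(self) -> float:
        """Duration of one scheduled NPRACH of class j: c_j * tau."""
        return self.c_j * self.tau

    def npusch_fraction(self) -> float:
        """Uplink fraction left for NPUSCH (single class shown; with several
        coverage classes the per-class NPRACH shares add up)."""
        return 1.0 - self.nprach_duration() / self.t_j
```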

Fig. 3: Queuing model of the NB-IoT access networking. The yellow and gray circles represent servers for downlink and uplink channels, respectively.

IV-B Derivations

D_sy in (2) is a function of the coverage class j; its average value has been reported in [7, Sec. 7.3]. D_rr,j is given by:

(5)

in which M represents the maximum allowed number of attempts, p_j the probability of successful resource reservation in an attempt, which depends on the number of devices in the class attempting the access, D_ra,j the expected latency in sending an RA message, and D_rar,j the expected latency in receiving the RAR message. D_ra,j is a function of the time interval between consecutive scheduling of NPRACHs, while D_rar,j depends on the operation of NPDCCH. NPDCCH can be seen as a queuing system in which the downlink server (see Fig. 3) visits the queue every t_d seconds and serves the existing requests. Thus, D_rar,j consists of i) waiting for NPDCCH to occur, which takes t_d/2 seconds on average, ii) the time spent waiting to be served once NPDCCH occurs, denoted by D_w,j, and iii) the transmission time, denoted by D_t,j.

We first characterize D_w,j. When the server visits the NPDCCH queue, on average there are:

(6)

requests waiting to be served, where the first term in (6) corresponds to NPRACH-initiated random access requests, see (1), and the second term models the arrival of BS-initiated control signals, see Fig. 3. Thus, the average waiting time before the service of a newly arrived RA message starts is determined by the number of waiting requests and the average service time in NPDCCH. Using τ_c as the average control packet transmission time, the average transmission time for class j is c_j τ_c. Thus:

(7)

which yields:

(8)
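Assuming the three-part decomposition of the RAR latency just described (wait for the next NPDCCH occasion, queueing, transmission with repetitions), a minimal bookkeeping sketch is given below; it is not the closed-form result (7)-(8), and the parameter names are ours.

```python
# Bookkeeping sketch of the RAR-latency decomposition described above:
# wait for the next NPDCCH occasion (t_d / 2 on average), queue behind the
# requests already waiting, then receive the control packet with c_j
# repetitions. Parameter names are ours; the closed forms are (7)-(8).

def expected_rar_latency(t_d: float, avg_queue_len: float,
                         avg_service_time: float, c_j: int, tau_c: float) -> float:
    wait_for_npdcch = t_d / 2.0                 # periodically scheduled channel
    wait_in_queue = avg_queue_len * avg_service_time
    transmission = c_j * tau_c                  # repetitions scale the airtime
    return wait_for_npdcch + wait_in_queue + transmission

print(f"{expected_rar_latency(0.10, 3.0, 0.004, 2, 0.004):.3f} s")
```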

Resource reservation of a device over NPRACH is successful if its transmitted preamble does not collide with the preambles of other nodes, which happens with probability P_nc,j, and the RA response is received within the RAR window, which happens with probability P_rar,j. Thus, the probability of successful resource reservation can be approximated as p_j ≈ P_nc,j P_rar,j. For a device belonging to class j, there are K_j orthogonal preambles available every t_j seconds, during which it contends on average with the other class-j devices that became active in that interval. Then, P_nc,j is derived as:

(9)
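A common way to approximate the preamble non-collision probability entering (9) is to assume that each of the n contenders picks one of the K_j orthogonal preambles uniformly at random; the sketch below implements this textbook approximation, which is not necessarily the exact expression derived in the paper.

```python
# Textbook approximation of the preamble non-collision probability: a tagged
# device succeeds if none of the other (n - 1) contenders picks the same one
# of the K orthogonal preambles.

def prob_no_collision(num_preambles: int, num_contenders: int) -> float:
    if num_contenders <= 1:
        return 1.0
    return (1.0 - 1.0 / num_preambles) ** (num_contenders - 1)

# Example: 48 preambles and 10 contenders in one NPRACH period -> ~0.83.
print(f"{prob_no_collision(48, 10):.2f}")
```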

The cumulative distribution functions of the service time for a single device and of the sum of service times for n devices are:

(10)

respectively, where u(·) is the unit step function. Then, P_rar,j, which is the probability that the RAR is received within the RAR window, is:

(11)

D_tx,j is a function of the scheduling of NPUSCH. The operation of NPUSCH can be seen as a queuing system in which the server handles requests in the fraction of each uplink frame that is allocated to NPUSCH, i.e., the fraction of uplink time that remains after the NPRACH scheduling of all coverage classes. The arrival of service requests to the NPUSCH can be modeled as a batch Poisson process (BPP), since resource reservations happen only in NPRACH periods. The mean batch size is:

(12)

and the rate of batch arrivals is 1/t_j, i.e., one batch per NPRACH period of the class. The uplink transmission time is determined by the packet size and the coverage class j. We assume that the packet length follows a general distribution with the first two moments equal to l_1 and l_2. Then, the transmission (i.e., service) time of an uplink packet follows a general distribution with the first two moments c_j l_1 / R_j^u and c_j^2 l_2 / (R_j^u)^2, where R_j^u is the average uplink transmission rate for class j. This queuing system is a BPP/G/1 system; hence, using the results from [27], one can derive the latency in data transmission for class j as:

(13)
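The BPP/G/1 result in (13) can be cross-checked with a small discrete-event simulation of a batch-arrival, single-server queue. The sketch below is such a cross-check under our own simplifying assumptions (deterministic batch-arrival instants at the NPRACH period, Poisson batch sizes, exponential service); it is not the simulator used in the paper.

```python
import math
import random

# Minimal batch-arrival, single-server FIFO queue, usable to cross-check a
# BPP/G/1-type latency analysis by simulation (assumptions: deterministic
# batch instants every t_ra, Poisson batch sizes, exponential service).

def simulate_mean_latency(t_ra=0.16, mean_batch=2.0, mean_service=0.02,
                          num_periods=200_000, seed=1):
    """Mean time from batch arrival to service completion (sojourn time)."""
    rng = random.Random(seed)
    server_free_at = 0.0
    total_delay, served = 0.0, 0
    for k in range(num_periods):
        arrival = k * t_ra                      # batches arrive at NPRACH instants
        # Draw a Poisson(mean_batch) batch size by inversion.
        batch, p = 0, rng.random()
        term = math.exp(-mean_batch)
        cum = term
        while p > cum:
            batch += 1
            term *= mean_batch / batch
            cum += term
        for _ in range(batch):                  # FIFO service of the whole batch
            start = max(arrival, server_free_at)
            service = rng.expovariate(1.0 / mean_service)
            server_free_at = start + service
            total_delay += server_free_at - arrival
            served += 1
    return total_delay / max(served, 1)

print(f"mean uplink sojourn time ~ {simulate_mean_latency():.4f} s")
```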

where ρ_j^u denotes the corresponding utilization of the uplink server. Similarly, the performance of NPDSCH can be seen as a queuing system in which the server visits the queue in a fraction of the frame time and serves the requests. This fraction corresponds to the subframes in which NPDCCH, NPBCH, NPSS, and NSSS are not scheduled, and can be derived similarly to (8) as:

(14)

The arrival of downlink service requests to the NPDSCH queue can also be seen as a BPP, as they arrive only after NPRACH has occurred. The mean batch size is:

(15)

and the rate of batch arrivals is again 1/t_j. The downlink transmission time is determined by the packet size and the coverage class j. Assuming that the packet length follows a general distribution with the first two moments l_1^d and l_2^d, the first two moments of the distribution of the packet transmission time are c_j l_1^d / R_j^d and c_j^2 l_2^d / (R_j^d)^2, where R_j^d is the average downlink data rate for coverage class j. Defining ρ_j^d as the corresponding utilization of the downlink server, the latency in data reception becomes:

(16)

Finally, we derive the average energy consumption of an uplink/downlink service. Denote by η, P_i, P_c, P_l, and P_t,j the power amplifier efficiency, the idle power consumption, the circuit power consumption of transmission, the listening power consumption, and the transmit power consumption for class j, respectively. Then,

(17)
(18)
(19)
(20)
(21)
(22)

from which the battery lifetime model (4) is derived as:

(23)
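As a rough illustration of how the power parameters defined above enter the energy components of (3), the sketch below converts per-phase durations into energies; the phase split, the default power values, and the overall accounting are our assumptions rather than the paper's expressions (17)-(22).

```python
# Illustrative mapping from the power parameters above (PA efficiency eta,
# idle power P_i, transmit circuit power P_c, listening power P_l, transmit
# power P_t) and per-phase durations to the energy components of (3).
# The time split and the numbers are our assumptions, not eqs. (17)-(22).

def energy_components(d_sync, d_rr_listen, d_rr_tx, d_tx, d_ack_listen, d_idle,
                      eta=0.5, p_i=1e-3, p_c=0.01, p_l=0.1, p_t=0.2):
    tx_power = p_t / eta + p_c  # radiated power scaled by PA efficiency + circuitry
    return {
        "E_sy": d_sync * p_l,                            # listening to NPSS/NSSS
        "E_rr": d_rr_tx * tx_power + d_rr_listen * p_l,  # preamble + RAR reception
        "E_tx": d_tx * tx_power,                         # uplink data, incl. repetitions
        "E_ack": d_ack_listen * p_l,                     # optional ACK reception
        "E_idle": d_idle * p_i,                          # waiting between channel occasions
    }

print(energy_components(d_sync=0.33, d_rr_listen=0.05, d_rr_tx=0.02,
                        d_tx=0.20, d_ack_listen=0.01, d_idle=1.0))
```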
TABLE I: Parameters for performance analysis, grouped into traffic, power, coverage, and other system parameters; the communication frame (CF) length is 10 ms.

V Performance Evaluation

In this section, we validate the derived expressions, highlight performance tradeoffs in channel scheduling, find optimized system operation points, and identify the mutual impact among the coexisting coverage classes. System parameters are presented in Table I.

Fig. 4 compares the analytical lifetime and latency expressions derived in Section IV-B (dashed curves) against the simulation results (solid curves) for class 1 devices. The x-axis represents the NPRACH scheduling period, i.e., the average time between two consecutive scheduling of random access resources. It is evident that the simulation results, including battery lifetime and service latency in uplink and downlink, match well with the respective analytical results.

Fig. 5 shows the mutual impact of two coverage classes coexisting in a cell, i.e., class 1 and class 2. The y-axis represents the expected battery lifetime for both classes, while the x-axis represents the number of repetitions for class 2, i.e., c_2. An increase in c_2 increases the amount of radio resources used for signal repetitions (i.e., coverage extension) of devices in class 2. This results in an increased latency both for class 1 and class 2 devices and, hence, increases the energy consumption per reporting period and decreases the battery lifetime. Also, it can be seen that an increase in the fraction of nodes belonging to class 2 adversely impacts the battery lifetime performance of class 1 devices; for instance, increasing c_2 from 11 to 13 decreases the average battery lifetime of class 1 nodes, and the loss grows with the fraction of class 2 devices. Nevertheless, the extended coverage enables devices in class 2 to become connected to the BS, i.e., provides deeper coverage in indoor areas.

Fig. 6(a) shows the expected battery lifetime versus the NPRACH and NPDCCH scheduling periods, i.e., the time intervals between two consecutive scheduling of NPRACH and NPDCCH, respectively, for the same coexistence scenario. Increasing the NPRACH period at first increases the lifetime of devices in both classes, as it leaves more resources for NPUSCH scheduling and decreases the time spent in data transmission, i.e., D_tx,j. Beyond a certain point, however, further increasing it reduces the lifetime due to the increase of the expected time spent in resource reservation. Similarly, increasing the NPDCCH period at first increases the lifetime by providing more resources for NPDSCH, thereby decreasing the time spent in data reception, while after a certain point it decreases the lifetime by increasing the expected time spent in resource reservation.

The impact of the NPRACH and NPDCCH scheduling periods on the latency of uplink/downlink services is shown in Fig. 6(b)/Fig. 6(c). If the uplink/downlink latency or the battery consumption represents the only optimization objective, it is straightforward to derive the optimized operation points. However, Figs. 6(a)-6(c) show that the objectives are coupled in conflicting ways. This is illustrated in Figs. 7 and 8, which show the normalized lifetime and latency for class 1 when one of the two scheduling periods is fixed: the values of the free parameter that minimize the downlink and uplink latencies differ from the value that maximizes the lifetime. Finally, Figs. 6(a)-6(c) show that the latency- and lifetime-optimized resource allocation strategies differ on a per-class basis; thus, selecting the optimized scheduling periods depends on the required quality of service (lifetime and/or latency) for each class.
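In practice, the closed-form expressions of Section IV can be plugged into a simple grid search over the NPRACH and NPDCCH scheduling periods to locate such operating points. The sketch below shows one way to do this, with the latency and lifetime evaluators left as stand-ins for the paper's expressions and a latency budget chosen by the operator; it is an illustration of the procedure, not the optimization used in the paper.

```python
# Sketch: grid-search the NPRACH period t_ra and the NPDCCH period t_d and
# keep the lifetime-maximizing point that still meets a latency budget.
# The evaluator callables are placeholders for the closed-form expressions.

from typing import Callable, Tuple

def best_operating_point(
    lifetime: Callable[[float, float], float],
    latency_ul: Callable[[float, float], float],
    t_ra_grid, t_d_grid, latency_budget_s: float,
) -> Tuple[float, float]:
    best, best_life = None, float("-inf")
    for t_ra in t_ra_grid:
        for t_d in t_d_grid:
            if latency_ul(t_ra, t_d) > latency_budget_s:
                continue                      # violates the latency constraint
            life = lifetime(t_ra, t_d)
            if life > best_life:
                best, best_life = (t_ra, t_d), life
    if best is None:
        raise ValueError("no (t_ra, t_d) pair meets the latency budget")
    return best
```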

Fig. 4: Comparison of analytical and simulation results versus the NPRACH scheduling period for class 1.
Fig. 5: Mutual impact between two coexisting classes in a cell versus the number of repetitions for the second class.
(a) Battery lifetime versus the NPRACH and NPDCCH scheduling periods.
(b) Uplink latency versus the NPRACH and NPDCCH scheduling periods.
(c) Downlink latency versus the NPRACH and NPDCCH scheduling periods.
Fig. 6: Performance as a function of the NPRACH and NPDCCH scheduling periods, i.e., the time intervals between two consecutive scheduling of NPRACH and NPDCCH, respectively.
Fig. 7: Overall performance analysis for class 1 vs. the NPDCCH scheduling period, with the NPRACH period fixed.
Fig. 8: Overall performance analysis for class 1 vs. the NPRACH scheduling period, with the NPDCCH period fixed.

VI Conclusion

NB-IoT access protocol scheduling has been investigated, and a tractable queuing model has been proposed to investigate the impact of scheduling on service latency and battery lifetime. Using the derived closed-form expressions, it has been shown that the scheduling of random access, control, and data channels cannot be treated separately, as the expected latencies and energy consumptions in the different channels are coupled in conflicting ways. Furthermore, the derived analytical model has been leveraged to investigate the performance impact of serving devices that experience high pathloss, and thus need more signal repetitions, on the latency and battery lifetime performance of the other nodes. Finally, given the set of radio resources provisioned for NB-IoT and the arrival traffic, optimized scheduling policies minimizing the experienced latency and maximizing the expected battery lifetime have been investigated.

Acknowledgment

The research presented in this paper was supported in part by Advanced Connectivity Platform for Vertical Segment (ACTIVE) and in part by the European Research Council (ERC Consolidator Grant Nr. 648382 WILLOW) within the Horizon 2020 Program.

References

  • [1] C. Mavromoustakis, G. Mastorakis, and J. M. Batalla, Internet of Things (IoT) in 5G mobile technologies.   Springer, 2016, vol. 8.
  • [2] É. Morin, M. Maman, R. Guizzetti, and A. Duda, “Comparison of the device lifetime in wireless networks for the internet of things,” IEEE Access, vol. 5, pp. 7097–7114, 2017.
  • [3] W. Yang et al., “Narrowband wireless access for low-power massive internet of things: A bandwidth perspective,” IEEE Wireless Communications, vol. 24, no. 3, pp. 138–145, 2017.
  • [4] B. Vejlgaard et al., “Interference impact on coverage and capacity for low power wide area IoT networks,” in IEEE WCNC, 2017, pp. 1–6.
  • [5] M. E. Soussi, P. Zand, F. Pasveer, and G. Dolmans, “Evaluating the Performance of eMTC and NB-IoT for Smart City Applications,” arXiv preprint arXiv:1711.07268, 2017.
  • [6] R. Ratasuk, N. Mangalvedhe, A. Ghosh, and B. Vejlgaard, “Narrowband LTE-M System for M2M Communication,” in IEEE VTC-Fall, Sept. 2014, pp. 1–5.
  • [7] 3GPP TR 45.820, “Technical Specification Group GSM/EDGE Radio Access Network; Cellular system support for ultra-low complexity and low throughput Internet of Things (CIoT),” 2015.
  • [8] J. Schlienz and D. Raddino, “Narrowband internet of things,” Rohde and Schwarz, Tech. Rep., 08 2016.
  • [9] R. Ratasuk, B. Vejlgaard, N. Mangalvedhe, and A. Ghosh, “NB-IoT system for M2M communication,” in IEEE WCNC, 2016, pp. 1–5.
  • [10] Y. P. E. Wang et al., “A primer on 3GPP narrowband internet of things,” IEEE Communications Mag., vol. 55, no. 3, pp. 117–123, March 2017.
  • [11] X. Lin, A. Adhikary, and Y. P. E. Wang, “Random access preamble design and detection for 3GPP Narrowband IoT systems,” IEEE Wireless Communications Letters, vol. 5, no. 6, pp. 640–643, Dec 2016.
  • [12] T. Kim, D. M. Kim, N. Pratas, P. Popovski, and D. K. Sung, “An enhanced access reservation protocol with a partial preamble transmission mechanism in NB-IoT systems,” IEEE Communications Letters, June 2017.
  • [13] C. Yu et al., “Uplink scheduling and link adaptation for narrowband internet of things systems,” IEEE Access, vol. 5, pp. 1724–1734, 2017.
  • [14] M. Lauridsen et al., “Coverage and capacity analysis of LTE-M and NB-IoT in a rural area,” in IEEE VTC Fall, 2016, pp. 1–5.
  • [15] A. Adhikary, X. Lin, and Y. P. E. Wang, “Performance Evaluation of NB-IoT Coverage,” in IEEE VTC-Fall, Sept 2016, pp. 1–5.
  • [16] Y. D. Beyene, R. Jantti, K. Ruttik, and S. Iraji, “On the Performance of Narrow-Band Internet of Things (NB-IoT),” in 2017 IEEE WCNC, March 2017.
  • [17] P. A. Maldonado et al., “Narrowband IoT data transmission procedures for massive machine-type communications,” IEEE Network, vol. 31, no. 6, pp. 8–15, November 2017.
  • [18] Y. Chen, S. Zhang, S. Xu, and G. Y. Li, “Fundamental trade-offs on green wireless networks,” IEEE Communications Magazine, vol. 49, no. 6, pp. 30–37, June 2011.
  • [19] S. H. Alonso, M. R. Pérez, M. F. Veiga, and C. L. García, “Adaptive DRX Scheme to Improve Energy Efficiency in LTE Networks With Bounded Delay,” IEEE Journal on Selected Areas in Communications, vol. 33, no. 12, pp. 2963–2973, Dec 2015.
  • [20] A. Azari and G. Miao, “Network lifetime maximization for cellular-based M2M networks,” IEEE Access, vol. 5, pp. 18 927–18 940, 2017.
  • [21] A. Aijaz, M. Tshangini, M. R. Nakhai, X. Chu, and A. H. Aghvami, “Energy-Efficient Uplink Resource Allocation in LTE Networks With M2M/H2H Co-Existence Under Statistical QoS Guarantees,” IEEE Transactions on Communications, vol. 62, no. 7, pp. 2353–2365, July 2014.
  • [22] K. Wang, J. A. Zarate, and M. Dohler, “Energy-efficiency of LTE for small data machine-to-machine communications,” in 2013 IEEE ICC, June 2013, pp. 4120–4124.
  • [23] G. C. Madueno, J. J. Nielsen, D. M. Kim, N. K. Pratas, C. Stefanovic, and P. Popovski, “Assessment of LTE Wireless Access for Monitoring of Energy Distribution in the Smart Grid,” IEEE Journal on Selected Areas in Communications, vol. 34, no. 3, pp. 675–688, March 2016.
  • [24] P. A. Maldonado et al., “Optimized LTE data transmission procedures for IoT: Device side energy consumption analysis,” in 2017 ICC Workshops, May 2017, pp. 540–545.
  • [25] A. Azari and G. Miao, “Network lifetime maximization for cellular-based M2M networks,” IEEE Access, vol. 5, pp. 18 927–18 940, 2017.
  • [26] 3GPP TSG-RAN1 Ad Hoc NB-IoT, “NPDSCH resource allocation,” Tech. Rep., March 2016.
  • [27] H. Akimaru and K. Kawashima, Teletraffic: theory and applications.   Springer Science and Business Media, 2012.