In this paper, we introduce a Bayesian semiparametric model that treats both the intercept and the regression coefficients nonparametrically. Meta-analysis and meta-regression usually rely on a parametric family; however, the growing use of Bayesian nonparametric and semiparametric models has recently reached this area as well. Existing Bayesian nonparametric and semiparametric approaches focus only on the intercept and pay little attention to the regressor coefficient(s). We also assess the efficiency of the proposed model via simulation and give an illustrative example.
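As a rough illustration of the kind of model considered (the notation and the Dirichlet process prior below are our own assumptions, not necessarily the authors' exact specification), a semiparametric meta-regression that relaxes parametric assumptions on both the intercept and the slope could be written as

$$y_i \mid \theta_i \sim \mathcal{N}\big(\beta_{0i} + \beta_{1i} x_i,\; s_i^2\big), \qquad \theta_i = (\beta_{0i}, \beta_{1i})^\top,$$
$$\theta_i \mid G \overset{\text{iid}}{\sim} G, \qquad G \sim \mathrm{DP}(\alpha, G_0), \qquad G_0 = \mathcal{N}_2(\mu, \Sigma),$$

where $y_i$ is the observed effect in study $i$ with known within-study standard error $s_i$ and study-level covariate $x_i$; placing the nonparametric prior on the whole vector $\theta_i$, rather than only on $\beta_{0i}$, is what extends the usual intercept-only formulations.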
Deaf people and people with hearing loss face a major problem in everyday communication. Many applications are available in the market to help blind people interact with the world; voice-based email and chat systems, for example, allow blind users to communicate with others. Similarly, many attempts have been made to bridge the communication gap between hearing and deaf people with Sign Language (SL) translators and thereby ease communication for deaf people. In this paper, geometric features are used for feature extraction in static sign recognition, and a Support Vector Machine (SVM) classifier is used for training and testing a system on static signs. The resulting accuracy for static signs with geometric features is 62.92%, which needs to be improved with other feature-extraction methods and classifiers.
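A minimal sketch of the training and testing pipeline described above, assuming the geometric features (e.g., fingertip distances and angles) have already been extracted into a numeric matrix; the feature dimensions, number of classes, and train/test split are illustrative, not the authors' exact setup:

```python
# Hedged sketch: SVM classification of static signs from precomputed
# geometric features (the feature-extraction step itself is not shown).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# X: one row per sign image, columns are geometric features
# (e.g., fingertip distances, joint angles); y: the sign label.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))          # placeholder feature matrix
y = rng.integers(0, 24, size=500)       # placeholder labels (24 static signs assumed)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```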
In many natural phenomena, the relationship between the input variables and the response variable in statistical studies may differ from the model that the researcher selects, because of features hidden in the structure of the data. This can strongly influence the distributions considered for the response variable. The optimal properties of the estimators were evaluated and studied for two candidate distributions considered for the response variable and for the input variables in the suggested model. A simulation study was carried out, and real data were also investigated. The results confirmed the superiority of a model that is close to the structure of the data.
One of the important challenges in Wireless Sensor Networks is to carry out data transmission in a way that prolongs the network's lifetime. A main issue is the reduction of latency at the nodes and of energy consumption at the sink nodes. Because the nodes' energy is limited and data transmission accounts for the largest share of energy consumption, it is important to design a structure that spends the least amount of energy sending data to the base station. In this paper, we use fuzzy logic with the Mamdani method for clustering to address this challenge, and time-division multiplexing to connect the nodes to the cluster head. The proposed clustering is based on the LEACH algorithm, whose capability and reliability are improved by fuzzy systems, and a particle swarm optimization algorithm is used to optimize the network routes. The simulation results show that energy consumption decreases as the number of rounds increases; for example, energy consumption reached 0.9 at round 2000 and 0.1 at round 5000.
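To make the cluster-head selection step concrete, here is a minimal, self-contained Mamdani-style fuzzy inference sketch (triangular membership functions, min-max inference, centroid defuzzification). The input variables (residual energy and distance to the sink) and the rule base are our illustrative assumptions, not the paper's exact design:

```python
# Hedged sketch: Mamdani fuzzy inference producing a "cluster-head chance"
# score from residual energy and distance to the sink (both scaled to [0, 1]).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a < b < c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def ch_chance(energy, distance, grid=np.linspace(0.0, 1.0, 201)):
    # Fuzzify the inputs.
    e_low, e_high = tri(energy, -0.5, 0.0, 0.5), tri(energy, 0.5, 1.0, 1.5)
    d_near, d_far = tri(distance, -0.5, 0.0, 0.5), tri(distance, 0.5, 1.0, 1.5)

    # Output fuzzy sets over the "chance" universe.
    c_low = tri(grid, -0.5, 0.0, 0.5)
    c_med = tri(grid, 0.25, 0.5, 0.75)
    c_high = tri(grid, 0.5, 1.0, 1.5)

    # Illustrative rule base (Mamdani: min for AND, clip the consequent).
    r1 = np.minimum(min(e_high, d_near), c_high)  # high energy & near sink -> high chance
    r2 = np.minimum(min(e_high, d_far),  c_med)   # high energy & far       -> medium chance
    r3 = np.minimum(min(e_low,  d_near), c_med)   # low energy  & near      -> medium chance
    r4 = np.minimum(min(e_low,  d_far),  c_low)   # low energy  & far       -> low chance

    aggregated = np.maximum.reduce([r1, r2, r3, r4])              # max aggregation
    return float((grid * aggregated).sum() / (aggregated.sum() + 1e-12))  # centroid

print(ch_chance(energy=0.8, distance=0.2))  # node with high energy close to the sink
```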
In this paper, a new adaptive Monte Carlo algorithm is proposed to solve the systems of linear algebraic equations arising from the Black-Scholes model for pricing European and American options. The proposed algorithm offers several advantages over conventional and previous adaptive Monte Carlo algorithms. The corresponding properties of the algorithm and its convergence theory are discussed, and numerical experiments demonstrating the computational efficiency of the proposed algorithm are presented. The results are also compared with other methods.
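For readers unfamiliar with Monte Carlo linear solvers, the standard (non-adaptive) construction, which we recall here only as background and which may differ from the authors' adaptive variant, rewrites the system in fixed-point form and estimates the Neumann series by random walks:

$$x = Lx + f, \qquad x = \sum_{m=0}^{\infty} L^{m} f \quad (\rho(L) < 1),$$
$$W_0 = 1, \qquad W_m = W_{m-1}\,\frac{\ell_{i_{m-1} i_m}}{p_{i_{m-1} i_m}}, \qquad x_{i_0} = \mathbb{E}\Big[\sum_{m \ge 0} W_m f_{i_m}\Big],$$

where the walk $i_0 \to i_1 \to \cdots$ moves according to transition probabilities $p_{jk}$; an adaptive algorithm typically tunes these probabilities or the number and length of the walks as the estimate improves.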
Due to several reasons, such as the low strength of the existing concrete and changes in codes or in the use of structures, some concrete frames need to be retrofitted. Adding a steel prop and curb to the reinforced concrete changes many parameters such as ductility, strength, and stiffness. This study numerically investigates the impact of adding a prop and curb, a slit damper, a gusset plate, and a prop with a ductile ring on the stiffness, strength, energy dissipation, and ductility of RC frames. For this purpose, the effect of the aforementioned methods on the linear and nonlinear behavior of reinforced concrete moment frames under monotonic loads has been investigated numerically using ABAQUS software. In the present study, 12 samples of reinforced frames with one story and one span were retrofitted by different methods. The novelty of the paper lies in using such props and slit dampers in RC frames. The modeling results showed that the frames retrofitted with the ductile ring, slit dampers, and gusset plate behaved better in terms of strength and stiffness than the plain RC frame, while the samples with the slit damper and with the prop and ductile ring showed more ductility and energy dissipation than the sample with the prop and curb.
In this paper, we consider the problem of parameter estimation in the negative binomial mixed model when it is suspected that some of the fixed parameters may be restricted to a subspace. We study linear shrinkage, preliminary test, shrinkage preliminary test, shrinkage, and positive shrinkage estimators alongside the unrestricted maximum likelihood and restricted estimators. The random effects are treated as nuisance parameters. We conduct a Monte Carlo simulation study to evaluate the performance of each estimator in terms of simulated relative efficiency. The results of the simulation study reveal that the proposed estimation strategies perform better than the maximum likelihood method. The proposed estimators are also applied to a real dataset to appraise their performance.
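For orientation, the generic forms of the estimators named above, written in our notation (with $\hat\beta^{U}$ the unrestricted and $\hat\beta^{R}$ the restricted maximum likelihood estimators, $T_n$ the test statistic for the subspace restriction, $c_\alpha$ its critical value, $\lambda \in (0,1)$ a fixed shrinkage weight, and $p_2$ the number of restricted coefficients; the exact constants in the paper may differ), are

$$\hat\beta^{LS} = \lambda\,\hat\beta^{R} + (1-\lambda)\,\hat\beta^{U}, \qquad
\hat\beta^{PT} = \hat\beta^{U} - \big(\hat\beta^{U}-\hat\beta^{R}\big)\, I(T_n \le c_\alpha),$$
$$\hat\beta^{S} = \hat\beta^{R} + \Big(1 - \frac{p_2-2}{T_n}\Big)\big(\hat\beta^{U}-\hat\beta^{R}\big), \qquad
\hat\beta^{S+} = \hat\beta^{R} + \Big(1 - \frac{p_2-2}{T_n}\Big)^{\!+}\big(\hat\beta^{U}-\hat\beta^{R}\big).$$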
The aim of this paper is to learn Bayesian network structure for discrete variables. For this purpose, we introduce a Gibbs sampler method in which each sample represents a Bayesian network; thus, in the process of Gibbs sampling, we obtain a set of Bayesian networks. To obtain a single graph that best fits the data, we use the mode of the burn-in graphs: the most frequent edges among these graphs are taken to form the best single graph. The results on well-known Bayesian networks show that our method has higher accuracy in learning Bayesian network structure.
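A minimal sketch of the "most frequent edges" aggregation step, assuming each Gibbs sample is already available as a set of directed edges (the sampler itself is not shown, and the 0.5 inclusion threshold is an illustrative assumption):

```python
# Hedged sketch: turning a collection of sampled Bayesian-network structures
# into a single graph by keeping the edges that appear most frequently.
from collections import Counter

def modal_graph(sampled_graphs, threshold=0.5):
    """sampled_graphs: list of sets of directed edges, e.g. {("A", "B"), ...}.
    Returns the edges whose inclusion frequency exceeds `threshold`."""
    counts = Counter(edge for g in sampled_graphs for edge in g)
    n = len(sampled_graphs)
    return {edge for edge, c in counts.items() if c / n > threshold}

# Toy usage: three sampled structures over variables A, B, C.
samples = [{("A", "B"), ("B", "C")},
           {("A", "B")},
           {("A", "B"), ("A", "C")}]
print(modal_graph(samples))   # ("A", "B") appears in all three samples
```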
In this paper, the probability of failure-free operation until time t, along with the stress-strength probability, is studied in a family of lifetime distributions based on progressively censored data. The classical maximum likelihood estimator (MLE) is proposed for the unknown parameters. For a numerical demonstration of the proposed estimation strategies, some bootstrap confidence intervals are constructed. The theoretical results are illustrated by real data examples and an extensive simulation study. The simulation evidence reveals that our proposed strategies perform well in estimating parameters from progressively censored data. Finally, we apply the proposed methodology to estimate the probability of failure-free operation until the breakdown time of an insulating fluid between electrodes, as well as the stress-strength reliability of carbon fibers.
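As a generic illustration of the percentile bootstrap intervals mentioned above (the exponential working model, the plug-in estimate of R(t), and resampling complete data rather than a progressive-censoring scheme are simplifying assumptions, not the paper's setup):

```python
# Hedged sketch: percentile bootstrap confidence interval for the probability
# of failure-free operation until time t under an exponential working model.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=100.0, size=40)   # placeholder lifetimes
t = 50.0

def reliability(sample, t):
    theta_hat = sample.mean()                  # MLE of the exponential mean
    return np.exp(-t / theta_hat)              # R(t) = P(T > t)

boot = np.array([reliability(rng.choice(data, size=data.size, replace=True), t)
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"R({t}) = {reliability(data, t):.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```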
Interval-valued data are observed as ranges instead of single values and contain richer information than single-valued data. Interval-valued data arise for interval-valued characteristics, for instance, daily temperature, daily stock prices, censoring times, grouped data, etc. Recent years have witnessed an increasing interest in interval-valued data analysis, and interval-valued variables have therefore attracted unprecedented attention in the last decade. Recently, different linear regression approaches have been introduced to analyze interval-valued data. If the distribution of the response variable belongs to the exponential family, the generalized linear models framework is used for modeling the relationships between interval-valued variables. An interval generalized linear model is proposed for the first time in this research, and a suitable model is then presented to estimate its parameters. Both models are built on interval arithmetic, and the estimation procedure for the parameters of the suitable model mirrors that of the interval generalized linear model. The least-squares (LS) estimation of the suitable model is developed according to a well-behaved distance on the interval space, and the LS estimation is solved analytically through a constrained minimization problem. Some desirable properties of the estimators are then checked. Finally, both the theoretical and the empirical performance of the estimators are investigated.
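One common choice of such a distance on the interval space, written here in midpoint-radius form (this specific metric is our illustrative assumption; the paper may use a different one), is

$$d_\theta^2(A, B) = \big(A^{c}-B^{c}\big)^2 + \theta\,\big(A^{r}-B^{r}\big)^2, \qquad \theta \in (0, 1],$$

where $A = [A^{c}-A^{r},\,A^{c}+A^{r}]$; least-squares estimation then minimizes the sum of such squared distances between observed and fitted intervals, subject to nonnegativity of the fitted radii.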
The multilinear normal distribution is a widely used tool in the tensor analysis of magnetic resonance imaging (MRI). Diffusion tensor MRI provides a statistical estimate of a symmetric 2nd-order diffusion tensor for each voxel within an imaging volume. In this article, the tensor elliptical (TE) distribution is introduced as an extension of the multilinear normal (MLN) distribution. Some properties, including the characteristic function and the distribution of affine transformations, are given. An integral representation connecting the densities of the TE and MLN distributions is exhibited and used to derive the expectation of any measurable function of a TE variate.
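For reference, the multilinear normal distribution that the TE family extends is conventionally characterized through vectorization (a standard definition; the ordering of the Kronecker factors follows one common convention):

$$\mathcal{X} \sim \mathcal{N}_{p_1\times\cdots\times p_K}\!\big(\mathcal{M};\,\Sigma_1,\ldots,\Sigma_K\big)
\iff \operatorname{vec}(\mathcal{X}) \sim \mathcal{N}_{p}\!\big(\operatorname{vec}(\mathcal{M}),\;\Sigma_K\otimes\cdots\otimes\Sigma_1\big), \qquad p=\prod_{k=1}^{K}p_k,$$

where each $\Sigma_k$ captures the covariance along the $k$-th mode of the tensor.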
The purpose of this research is to identify and introduce the factors that affect the adoption of e-learning, based on the technology acceptance model. Accordingly, by reviewing the studies conducted in this field, several variables, such as computer self-efficacy, content quality, system support, interface design, technology tools, and computer anxiety, were extracted as factors influencing the adoption of an e-learning system, and a conceptual research model was developed from them. To measure the model and the relationships between its variables, a questionnaire was designed and administered to users of the electronic education system of Qazvin University of Medical Sciences. Using structural equation modeling, the data analysis confirmed all hypotheses except for the effect of technology tools on acceptance of the e-learning system. The findings of this study will help university administrators and the professors associated with the system to encourage students to use it effectively by providing the necessary groundwork for the influential factors.
Bayesian variable selection is widely used as a methodology in air quality control trials and generalized linear models. One of the important and, of course, controversial topics in this area is the choice of prior distribution for the unknown model parameters. The aim of this study is to present a substitute for a mixture of priors that, besides preserving its benefits and computational efficiency, avoids the known paradoxes and contradictions. In this research, we consider two points of view, empirical Bayes and fully Bayes. In particular, a mixture of priors and its theoretical characteristics are given. Finally, the proposed model is illustrated with a real example.
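A canonical example of the kind of mixture prior discussed, shown only for orientation since the paper proposes its own substitute, is the spike-and-slab form

$$\beta_j \mid \gamma_j \sim (1-\gamma_j)\,\delta_0 + \gamma_j\,\mathcal{N}(0,\tau^2), \qquad \gamma_j \overset{\text{iid}}{\sim} \mathrm{Bernoulli}(\pi), \qquad j=1,\ldots,p,$$

in which the posterior inclusion probabilities of the indicators $\gamma_j$ drive the variable selection.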
This paper considers an extension of the linear mixed model, called the semiparametric mixed-effects model, for longitudinal data when multicollinearity is present. To overcome this problem, a new mixed ridge estimator is proposed, while the nonparametric function in the semiparametric model is approximated by the kernel method. The proposed approach integrates the ridge method into the semiparametric mixed-effects modeling framework to account for both the correlation induced by repeatedly measuring an outcome on each individual over time and the potentially high degree of correlation among candidate predictor variables. The asymptotic normality of the proposed estimator is established. To improve efficiency, the covariance function is estimated using an iterative algorithm. The performance of the proposed estimator is assessed through a simulation study and an analysis of the CD4 data.
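In generic notation (our simplification of the estimator's structure, with $V$ the marginal covariance of the response, $k \ge 0$ the ridge parameter, and the nonparametric part already profiled out by the kernel step so that $\tilde y$ denotes the adjusted response), a mixed ridge estimator takes the familiar form

$$\hat\beta_{\mathrm{ridge}}(k) = \big(X^\top V^{-1} X + k I\big)^{-1} X^\top V^{-1} \tilde y,$$

which reduces to the usual mixed-model generalized least squares estimator when $k = 0$ and stabilizes the fit when $X^\top V^{-1} X$ is nearly singular.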
The estimation of a quantile density function in the biased nonparametric regression model is investigated. We propose and develop a new wavelet-based methodology for this problem. In particular, an adaptive hard-thresholding wavelet estimator is constructed. Under mild assumptions on the model, we prove that it enjoys powerful mean integrated squared error properties over Besov balls. The performance of the proposed estimator is investigated by a numerical study.
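The hard-thresholding rule referred to above keeps only the large empirical wavelet coefficients (generic form; the paper's threshold constants and truncation levels are specific to its model):

$$\hat f(x) = \sum_{k}\hat\alpha_{j_0 k}\,\phi_{j_0 k}(x) + \sum_{j=j_0}^{j_1}\sum_{k}\hat\beta_{jk}\,\mathbf{1}\{|\hat\beta_{jk}|>\lambda_j\}\,\psi_{jk}(x),$$

where $\phi$ and $\psi$ are the scaling function and mother wavelet and the thresholds $\lambda_j$ are typically of order $\sqrt{\log n / n}$.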
In the meta-analysis of clinical trials, the data of each trial are usually summarized by one or more outcome-measure estimates, which are reported along with their standard errors. When the summary data are multi-dimensional, the analysis is usually performed as several separate univariate analyses, and the correlation between the summary statistics is ignored. In contrast, a multivariate meta-analysis model uses these correlations and synthesizes the outcomes jointly to estimate the multiple pooled effects simultaneously. In this paper, we present a nonparametric Bayesian bivariate random-effects meta-analysis.
A proper method of monitoring a stochastic system is to use the control charts of statistical process control, in which a drift in the characteristics of the output may be due to one or several assignable causes. In establishing X charts in statistical process control, it is assumed that there is no correlation within the samples. In practice, however, there are many cases where correlation does exist within the samples, and it is more appropriate to assume that each sample is a realization of a multivariate normal random vector. Using three different loss functions in the economic and economic-statistical design of quality control charts leads to better decisions in industry. Although some research works have considered the economic design of control charts under a single assignable cause and correlated data, the economic-statistical design of the X control chart for multiple assignable causes and correlated data under a Weibull shock model with three different loss functions has not been presented yet. Based on optimizing the average cost per unit of time and taking into account different combinations of the Weibull distribution parameters, optimal values of the sample size, sampling interval, and control limit coefficient were derived and calculated. The cost models under non-uniform and uniform sampling schemes were then compared. The results revealed that the model with multiple assignable causes and correlated samples under non-uniform sampling, integrated with the three loss functions, has a lower cost than the model with uniform sampling.
Imprecise measurement tools produce imprecise data, and interval-valued data are commonly used to deal with such imprecision; interval-valued variables are therefore used in estimation methods. Linear regression models have recently been used to model them, and when the response variable follows a statistical distribution, interval-valued variables are modeled within the generalized linear models framework. In this article, we propose a new consistent estimator of the parameters of the generalized linear model with respect to the distribution of the response variable in the exponential family. A simulation study shows that the new estimator outperforms existing ones for particular distributions of the response variable. We also present the optimal properties of the estimators in this research.
In this article, we consider the problem of estimating the stress-strength reliability Pr(X > Y) based on upper record values when X and Y are two independent but not identically distributed random variables from the power hazard rate distribution with a common scale parameter k. When the parameter k is known, the maximum likelihood estimator (MLE), the approximate Bayes estimator, and exact confidence intervals of the stress-strength reliability are obtained. When the parameter k is unknown, we obtain the MLE and some bootstrap confidence intervals of the stress-strength reliability. We also apply the Gibbs sampling technique to study the Bayesian estimation of the stress-strength reliability and its corresponding credible interval. An example is presented to illustrate the inferences discussed, and finally, a Monte Carlo simulation study is conducted to investigate and compare the performance of the different methods proposed in this paper.
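To fix ideas, under the usual power hazard rate parametrization $h(x) = \lambda x^{k}$, so that $S(x) = \exp\!\big(-\lambda x^{k+1}/(k+1)\big)$, which we assume here purely for illustration, the stress-strength reliability with a common $k$ has a closed form:

$$R = \Pr(X>Y) = \int_0^\infty S_X(y)\,f_Y(y)\,dy
= \int_0^\infty e^{-\lambda_1 y^{k+1}/(k+1)}\,\lambda_2 y^{k}\,e^{-\lambda_2 y^{k+1}/(k+1)}\,dy
= \frac{\lambda_2}{\lambda_1+\lambda_2},$$

so estimating $R$ reduces to estimating the two rate parameters, for example by maximum likelihood from the record values.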
Traditionally, statistical quality control techniques use either attribute or variable product quality measures. Recently, some methods, such as the three-level control chart, have been developed for monitoring multi-attribute processes. A control chart usually has three design parameters: the sample size (n), the sampling interval (h), and the control limit coefficient (k). The design parameters of the control chart are generally specified according to statistical and/or economic criteria. The variable sampling interval (VSI) control scheme has been shown to improve the detecting efficiency of the control chart relative to a fixed sampling rate (FRS) scheme. In this paper, a method is proposed for the economic-statistical design of a variable sampling interval three-level control chart. We use the cost model developed by Costa and Rahim and optimize it with a genetic algorithm approach. We compare the expected cost per unit time of the VSI and FRS three-level control charts, and the results indicate that the proposed chart has improved performance.
This paper presents approximate confidence intervals for each function of the parameters in a Banach space, based on a bootstrap algorithm. We apply a kernel density approach to estimate the persistence landscape. In addition, we evaluate the quality of the distribution function estimator of random variables using the integrated mean squared error (IMSE). The results of the simulation studies show a significant improvement achieved by our approach compared to the standard version of the confidence interval algorithm. Finally, the real data analysis demonstrates the accuracy of our method compared to previous works for computing the confidence interval.
The partial linear model is very flexible because the relationship between the covariates and the response is partly parametric and partly nonparametric. However, the estimation of the regression coefficients is challenging because the nonparametric component must be estimated simultaneously. As a remedy, the differencing approach can be used to eliminate the nonparametric component and estimate the regression coefficients. Here, we suppose the regression parameter vector is subject to a subspace hypothesis. In situations where the difference-based least absolute shrinkage and selection operator (D-LASSO) is desirable, we propose a restricted D-LASSO estimator, and to improve its performance, LASSO-type shrinkage estimators are also developed. The relative dominance picture of the suggested estimators is investigated; in particular, the suitability of estimating the nonparametric component based on the Speckman approach is explored. A real data example is given to compare the proposed estimators. The numerical analysis shows that the partial difference-based shrinkage estimators perform better than the difference-based regression model in terms of average prediction error.
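A minimal sketch of the difference-based idea combined with a LASSO fit (first-order differencing on data sorted by the nonparametric covariate; the differencing order, penalty level, and simulated data are illustrative assumptions, not the paper's restricted or shrinkage estimators):

```python
# Hedged sketch: difference-based LASSO for a partial linear model
#   y_i = x_i' beta + f(t_i) + e_i .
# Sorting by t and first-differencing approximately removes the smooth f,
# after which an ordinary LASSO is fitted to the differenced data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 200, 8
t = np.sort(rng.uniform(0, 1, size=n))
X = rng.normal(size=(n, p))
beta = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta + np.sin(2 * np.pi * t) + rng.normal(scale=0.5, size=n)

# First-order differencing (data are already ordered by t).
dX, dy = np.diff(X, axis=0), np.diff(y)

d_lasso = Lasso(alpha=0.05, fit_intercept=False).fit(dX, dy)
print("estimated coefficients:", np.round(d_lasso.coef_, 2))
```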
In a referendum conducted in the United Kingdom (UK) on June 23, 2016, 51.6% of the participants voted to leave the European Union (EU). The outcome of this referendum had a major policy and financial impact on both the UK and the EU and was seen as a surprise, because the predictions had consistently indicated that "Remain" would win a majority. In this paper, we investigate whether the outcome of the Brexit referendum could have been predicted from polling data. The data consist of 233 polls conducted between January 2014 and June 2016 by YouGov, Populus, ComRes, Opinion, and others, with sample sizes ranging from 500 to 20,058. We used Singular Spectrum Analysis (SSA), an increasingly popular and widely adopted filtering technique for both short and long time series. We found that the real outcome of the referendum is very close to our point estimate and lies within our prediction interval, which reinforces the usefulness of SSA for predicting polling data.
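A minimal sketch of the basic SSA decomposition-reconstruction steps used to filter a polls series (the window length, number of retained components, and toy series are illustrative assumptions, not the paper's tuning):

```python
# Hedged sketch: basic Singular Spectrum Analysis (embedding, SVD,
# reconstruction by diagonal averaging) used as a smoother for a short series.
import numpy as np

def ssa_reconstruct(series, window, n_components):
    x = np.asarray(series, dtype=float)
    N, L = len(x), window
    K = N - L + 1
    # 1) Embedding: trajectory (Hankel) matrix, one lagged window per column.
    traj = np.column_stack([x[i:i + L] for i in range(K)])
    # 2) SVD of the trajectory matrix.
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    # 3) Keep the leading components and rebuild an approximate trajectory matrix.
    approx = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components, :]
    # 4) Diagonal (Hankel) averaging back to a single series.
    recon = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        recon[j:j + L] += approx[:, j]
        counts[j:j + L] += 1
    return recon / counts

# Toy "Leave share" series: slow trend plus noise.
polls = 48 + 3 * np.sin(np.linspace(0, 6, 120)) + np.random.default_rng(3).normal(0, 1, 120)
trend = ssa_reconstruct(polls, window=24, n_components=2)
print(trend[-5:])   # smoothed values near the end of the series
```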
In various statistical models, such as density estimation and the estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A known estimator of the density f under the restriction that f is decreasing is the Grenander estimator, which is the left derivative of the least concave majorant of the empirical distribution function of the data. Many authors have worked on this estimator and obtained very useful properties for it. The Grenander estimator is a step function and, as a consequence, is not smooth. In this paper, we discuss the estimation of a decreasing density function by the kernel smoothing method. Much work has been done because of the importance and applicability of Berry-Esseen bounds for density estimators. In this paper, we study a Berry-Esseen type bound for a smoothed version of the Grenander estimator.
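A minimal sketch of the Grenander estimator (the slopes of the least concave majorant of the empirical distribution function) and of a kernel-smoothed version of it; the Gaussian kernel, the bandwidth, the assumption that the support starts at zero, and the simulated sample are illustrative choices, not the paper's exact construction:

```python
# Hedged sketch: Grenander estimator via the least concave majorant (LCM) of
# the ECDF, followed by simple kernel smoothing of the resulting step density.
import numpy as np

def grenander(x):
    """Return knots t and the decreasing step density (slopes of the LCM)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    xs = np.concatenate(([0.0], x))            # assume the support starts at 0
    ys = np.arange(n + 1) / n                  # ECDF values at the knots
    hull = [0]                                 # indices of LCM vertices
    for i in range(1, n + 1):
        # drop the last vertex while it lies on or below the chord to the new point
        while len(hull) >= 2:
            a, b = hull[-2], hull[-1]
            if (ys[b] - ys[a]) * (xs[i] - xs[a]) <= (ys[i] - ys[a]) * (xs[b] - xs[a]):
                hull.pop()
            else:
                break
        hull.append(i)
    t = xs[hull]
    slopes = np.diff(ys[hull]) / np.diff(t)    # left derivative = Grenander density
    return t, slopes

def smoothed_grenander(x, grid, h):
    """Convolve the Grenander step density with a Gaussian kernel of bandwidth h."""
    t, slopes = grenander(x)
    u = np.linspace(t[0], t[-1], 2000)
    step = slopes[np.clip(np.searchsorted(t, u, side="right") - 1, 0, len(slopes) - 1)]
    du = u[1] - u[0]
    K = np.exp(-0.5 * ((grid[:, None] - u[None, :]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return (K * step[None, :]).sum(axis=1) * du

sample = np.random.default_rng(4).exponential(scale=1.0, size=200)
grid = np.linspace(0.01, 3.0, 50)
print(smoothed_grenander(sample, grid, h=0.2)[:5])
```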